Hi all,

After a few years of running Spark through Livy, I wanted to try the Spark interpreter directly on YARN in cluster mode.

I'm on HDP 2.6.5.x (Hadoop) with Spark 2.3.0, and the cluster is kerberized. I built Zeppelin from the master branch and updated my interpreter settings (see the attachment). After running any Spark paragraph I get:

org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
...
Caused by: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.

I have also attached the logs from YARN and Zeppelin. Notably, when I switch from cluster mode back to client mode, everything works as expected. If you have any idea what I got wrong in this configuration, please let me know, as I'm getting quite frustrated here :(

Thanks a million,
best regards,
Miso
Interpreter settings:

SPARK_HOME = /usr/hdp/2.6.5.0-292/spark2
master = yarn
spark.app.name = test_drive
spark.driver.cores = 1
spark.driver.memory = 1g
spark.executor.cores = 1
spark.executor.memory = 1g
spark.files =
spark.jars =
spark.jars.packages =
zeppelin.spark.useHiveContext = true
zeppelin.spark.printREPLOutput = true
zeppelin.spark.maxResult = 1000
zeppelin.spark.enableSupportedVersionCheck = true
zeppelin.spark.uiWebUrl =
zeppelin.spark.ui.hidden = false
spark.webui.yarn.useProxy = false
zeppelin.spark.scala.color = true
zeppelin.spark.deprecatedMsg.show = true
zeppelin.spark.concurrentSQL = false
zeppelin.spark.concurrentSQL.max = 10
zeppelin.spark.sql.stacktrace = false
zeppelin.spark.sql.interpolation = false
PYSPARK_PYTHON = python
PYSPARK_DRIVER_PYTHON = python
zeppelin.pyspark.useIPython = true
zeppelin.R.knitr = true
zeppelin.R.cmd = R
zeppelin.R.image.width = 100%
zeppelin.R.render.options = out.format = 'html', comment = NA, echo = FALSE, results = 'asis', message = F, warning = F, fig.retina = 2
zeppelin.kotlin.shortenTypes = true
HADOOP_CONF_DIR = /etc/hadoop/conf
spark.submit.deployMode = cluster
SPARK_APP_JAR = /usr/hdp/2.6.5.0-292/spark2/jars
JAVA_HOME = /usr/lib/jvm/java-8-oracle/jre/bin/java
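In case it helps anyone reproduce this outside Zeppelin: as far as I understand, the settings above should correspond to roughly the spark-submit below. This is only a sketch from my HDP layout, not a verified command; the examples jar name (with the HDP version suffix) and the kinit step are my assumptions.

```shell
# Rough plain-spark-submit equivalent of the interpreter settings above,
# to check whether YARN cluster mode works outside Zeppelin at all.
# Assumptions: the examples jar filename matches the HDP 2.6.5.0-292 build,
# and a valid Kerberos ticket already exists (kinit done beforehand).
export SPARK_HOME=/usr/hdp/2.6.5.0-292/spark2
export HADOOP_CONF_DIR=/etc/hadoop/conf

"$SPARK_HOME/bin/spark-submit" \
  --master yarn \
  --deploy-mode cluster \
  --name test_drive \
  --driver-cores 1 \
  --driver-memory 1g \
  --executor-cores 1 \
  --executor-memory 1g \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME/examples/jars/spark-examples_2.11-2.3.0.2.6.5.0-292.jar" 10
```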
Zeppelin server log:

INFO [2020-04-01 14:52:00,389] ({main} ZeppelinConfiguration.java[create]:163) - Load configuration from file:/opt/zeppelin-2020.03.31/conf/zeppelin-site.xml
INFO [2020-04-01 14:52:00,459] ({main} ZeppelinConfiguration.java[create]:171) - Server Host: 0.0.0.0
INFO [2020-04-01 14:52:00,459] ({main} ZeppelinConfiguration.java[create]:173) - Server Port: 9995
INFO [2020-04-01 14:52:00,459] ({main} ZeppelinConfiguration.java[create]:177) - Context Path: /
INFO [2020-04-01 14:52:00,459] ({main} ZeppelinConfiguration.java[create]:178) - Zeppelin Version: 0.9.0-SNAPSHOT
INFO [2020-04-01 14:52:00,477] ({main} Log.java[initialized]:193) - Logging initialized @461ms to org.eclipse.jetty.util.log.Slf4jLog
WARN [2020-04-01 14:52:00,728] ({main} ZeppelinConfiguration.java[getConfigFSDir]:631) - zeppelin.config.fs.dir is not specified, fall back to local conf directory zeppelin.conf.dir
INFO [2020-04-01 14:52:00,756] ({main} Credentials.java[loadFromFile]:121) - /opt/zeppelin-server/conf/credentials.json
INFO [2020-04-01 14:52:00,784] ({ImmediateThread-1585745520724} PluginManager.java[loadNotebookRepo]:60) - Loading NotebookRepo Plugin: org.apache.zeppelin.notebook.repo.VFSNotebookRepo
INFO [2020-04-01 14:52:00,785] ({ImmediateThread-1585745520724} VFSNotebookRepo.java[setNotebookDirectory]:70) - Using notebookDir: /opt/zeppelin-server/notebook
INFO [2020-04-01 14:52:00,827] ({main} ZeppelinServer.java[setupWebAppContext]:488) - warPath is: /opt/zeppelin-server/zeppelin-web-0.9.0-SNAPSHOT.war
INFO [2020-04-01 14:52:00,827] ({main} ZeppelinServer.java[setupWebAppContext]:501) - ZeppelinServer Webapp path: /opt/zeppelin-server/webapps
INFO [2020-04-01 14:52:00,849] ({main} ZeppelinServer.java[setupWebAppContext]:488) - warPath is:
INFO [2020-04-01 14:52:00,849] ({main} ZeppelinServer.java[setupWebAppContext]:501) - ZeppelinServer Webapp path: /opt/zeppelin-server/webapps/next
INFO [2020-04-01 14:52:00,883] ({main} NotebookServer.java[<init>]:153) - NotebookServer instantiated: org.apache.zeppelin.socket.NotebookServer@7748410a
INFO [2020-04-01 14:52:00,884] ({main} NotebookServer.java[setNotebook]:164) - Injected NotebookProvider
INFO [2020-04-01 14:52:00,885] ({main} NotebookServer.java[setServiceLocator]:158) - Injected ServiceLocator: ServiceLocatorImpl(shared-locator,0,1884122755)
INFO [2020-04-01 14:52:00,885] ({main} NotebookServer.java[setNotebookService]:171) - Injected NotebookServiceProvider
INFO [2020-04-01 14:52:00,886] ({main} NotebookServer.java[setAuthorizationServiceProvider]:178) - Injected NotebookAuthorizationServiceProvider
INFO [2020-04-01 14:52:00,886] ({main} NotebookServer.java[setConnectionManagerProvider]:184) - Injected ConnectionManagerProvider
INFO [2020-04-01 14:52:00,886] ({main} ZeppelinServer.java[setupClusterManagerServer]:439) - Cluster mode is disabled
INFO [2020-04-01 14:52:00,892] ({main} ZeppelinServer.java[main]:251) - Starting zeppelin server
INFO [2020-04-01 14:52:00,893] ({main} Server.java[doStart]:370) - jetty-9.4.18.v20190429; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 1.8.0_201-b09
INFO [2020-04-01 14:52:00,969] ({main} StandardDescriptorProcessor.java[visitServlet]:283) - NO JSP Support for /, did not find org.eclipse.jetty.jsp.JettyJspServlet
INFO [2020-04-01 14:52:00,982] ({main} DefaultSessionIdManager.java[doStart]:365) - DefaultSessionIdManager workerName=node0
INFO [2020-04-01 14:52:00,982] ({main} DefaultSessionIdManager.java[doStart]:370) - No SessionScavenger set, using defaults
INFO [2020-04-01 14:52:00,984] ({main} HouseKeeper.java[startScavenging]:149) - node0 Scavenging every 600000ms
INFO [2020-04-01 14:52:00,992] ({main} ContextHandler.java[log]:2349) - Initializing Shiro environment
INFO [2020-04-01 14:52:00,992] ({main} EnvironmentLoader.java[initEnvironment]:133) - Starting Shiro environment initialization.
INFO [2020-04-01 14:52:01,367] ({main} IniRealm.java[processDefinitions]:188) - IniRealm defined, but there is no [users] section defined. This realm will not be populated with any users and it is assumed that they will be populated programatically. Users must be defined for this Realm instance to be useful.
INFO [2020-04-01 14:52:01,371] ({main} EnvironmentLoader.java[initEnvironment]:147) - Shiro environment initialized in 379 ms.
INFO [2020-04-01 14:52:02,231] ({main} ContextHandler.java[doStart]:855) - Started o.e.j.w.WebAppContext@6150c3ec{zeppelin-web,/,jar:file:///opt/zeppelin-2020.03.31/zeppelin-web-0.9.0-SNAPSHOT.war!/,AVAILABLE}{/opt/zeppelin-server/zeppelin-web-0.9.0-SNAPSHOT.war}
INFO [2020-04-01 14:52:02,242] ({main} StandardDescriptorProcessor.java[visitServlet]:283) - NO JSP Support for /next, did not find org.eclipse.jetty.jsp.JettyJspServlet
INFO [2020-04-01 14:52:02,243] ({main} ContextHandler.java[log]:2349) - Initializing Shiro environment
INFO [2020-04-01 14:52:02,243] ({main} EnvironmentLoader.java[initEnvironment]:133) - Starting Shiro environment initialization.
INFO [2020-04-01 14:52:02,250] ({main} IniRealm.java[processDefinitions]:188) - IniRealm defined, but there is no [users] section defined. This realm will not be populated with any users and it is assumed that they will be populated programatically. Users must be defined for this Realm instance to be useful.
INFO [2020-04-01 14:52:02,251] ({main} EnvironmentLoader.java[initEnvironment]:147) - Shiro environment initialized in 8 ms.
INFO [2020-04-01 14:52:02,355] ({main} ContextHandler.java[doStart]:855) - Started o.e.j.w.WebAppContext@229f66ed{/next,file:///,AVAILABLE}{/}
INFO [2020-04-01 14:52:02,369] ({main} AbstractConnector.java[doStart]:292) - Started ServerConnector@704a52ec{HTTP/1.1,[http/1.1]}{0.0.0.0:9995}
INFO [2020-04-01 14:52:02,369] ({main} Server.java[doStart]:410) - Started @2355ms
INFO [2020-04-01 14:52:04,615] ({qtp701141022-14} NotebookServer.java[onOpen]:243) - New connection from 10.100.212.124:54972
WARN [2020-04-01 14:52:04,661] ({qtp701141022-14} ZeppelinConfiguration.java[getConfigFSDir]:631) - zeppelin.config.fs.dir is not specified, fall back to local conf directory zeppelin.conf.dir
WARN [2020-04-01 14:52:04,661] ({qtp701141022-14} ZeppelinConfiguration.java[getConfigFSDir]:631) - zeppelin.config.fs.dir is not specified, fall back to local conf directory zeppelin.conf.dir
WARN [2020-04-01 14:52:04,662] ({qtp701141022-14} ZeppelinConfiguration.java[getConfigFSDir]:631) - zeppelin.config.fs.dir is not specified, fall back to local conf directory zeppelin.conf.dir
WARN [2020-04-01 14:52:04,662] ({qtp701141022-14} LocalConfigStorage.java[loadNotebookAuthorization]:83) - NotebookAuthorization file /opt/zeppelin-server/conf/notebook-authorization.json is not existed
INFO [2020-04-01 14:52:07,369] ({main} ZeppelinServer.java[main]:265) - Done, zeppelin server started
INFO [2020-04-01 14:52:08,462] ({qtp701141022-13} AbstractValidatingSessionManager.java[enableSessionValidation]:233) - Enabling session validation scheduler...
INFO [2020-04-01 14:52:08,631] ({qtp701141022-13} InterpreterSettingManager.java[<init>]:186) - Using RecoveryStorage: org.apache.zeppelin.interpreter.recovery.NullRecoveryStorage
INFO [2020-04-01 14:52:08,632] ({qtp701141022-13} InterpreterSettingManager.java[<init>]:192) - Using LifecycleManager: org.apache.zeppelin.interpreter.lifecycle.NullLifecycleManager
INFO [2020-04-01 14:52:08,641] ({Thread-15} RemoteInterpreterEventServer.java[run]:104) - InterpreterEventServer is starting at 192.168.200.65:38198
INFO [2020-04-01 14:52:09,135] ({qtp701141022-13} RemoteInterpreterEventServer.java[start]:128) - RemoteInterpreterEventServer is started
INFO [2020-04-01 14:52:09,163] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: spark
INFO [2020-04-01 14:52:09,168] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: java
INFO [2020-04-01 14:52:09,176] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: submarine
INFO [2020-04-01 14:52:09,178] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: jupyter
INFO [2020-04-01 14:52:09,181] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: python
INFO [2020-04-01 14:52:09,183] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: angular
INFO [2020-04-01 14:52:09,185] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: file
INFO [2020-04-01 14:52:09,187] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: md
INFO [2020-04-01 14:52:09,190] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: livy
INFO [2020-04-01 14:52:09,192] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: groovy
INFO [2020-04-01 14:52:09,193] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: sh
INFO [2020-04-01 14:52:09,195] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: jdbc
INFO [2020-04-01 14:52:09,197] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: kotlin
INFO [2020-04-01 14:52:09,199] ({qtp701141022-13} InterpreterSettingManager.java[registerInterpreterSetting]:463) - Register InterpreterSettingTemplate: r
INFO [2020-04-01 14:52:09,200] ({qtp701141022-13} LocalConfigStorage.java[loadInterpreterSettings]:69) - Load Interpreter Setting from file: /opt/zeppelin-server/conf/interpreter.json
INFO [2020-04-01 14:52:09,238] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting python from interpreter.json
INFO [2020-04-01 14:52:09,238] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting kotlin from interpreter.json
INFO [2020-04-01 14:52:09,238] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting jdbc from interpreter.json
INFO [2020-04-01 14:52:09,239] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting angular from interpreter.json
INFO [2020-04-01 14:52:09,239] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting livy from interpreter.json
INFO [2020-04-01 14:52:09,239] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting r from interpreter.json
INFO [2020-04-01 14:52:09,240] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting java from interpreter.json
INFO [2020-04-01 14:52:09,240] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting jupyter from interpreter.json
INFO [2020-04-01 14:52:09,240] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting file from interpreter.json
INFO [2020-04-01 14:52:09,240] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting groovy from interpreter.json
INFO [2020-04-01 14:52:09,241] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting submarine from interpreter.json
INFO [2020-04-01 14:52:09,241] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting spark from interpreter.json
INFO [2020-04-01 14:52:09,242] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting sh from interpreter.json
INFO [2020-04-01 14:52:09,242] ({qtp701141022-13} InterpreterSettingManager.java[loadFromFile]:284) - Create Interpreter Setting md from interpreter.json
INFO [2020-04-01 14:52:09,260] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter python status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,261] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter kotlin status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,261] ({Thread-16} InterpreterSetting.java[setStatus]:740) - Set interpreter python status to READY
INFO [2020-04-01 14:52:09,262] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter jdbc status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,262] ({Thread-17} InterpreterSetting.java[setStatus]:740) - Set interpreter kotlin status to READY
INFO [2020-04-01 14:52:09,262] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter angular status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,262] ({Thread-18} InterpreterSetting.java[setStatus]:740) - Set interpreter jdbc status to READY
INFO [2020-04-01 14:52:09,263] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter livy status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,263] ({Thread-19} InterpreterSetting.java[setStatus]:740) - Set interpreter angular status to READY
INFO [2020-04-01 14:52:09,264] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter r status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,285] ({Thread-20} InterpreterSetting.java[setStatus]:740) - Set interpreter livy status to READY
INFO [2020-04-01 14:52:09,285] ({Thread-21} InterpreterSetting.java[setStatus]:740) - Set interpreter r status to READY
INFO [2020-04-01 14:52:09,285] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter java status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,287] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter jupyter status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,287] ({Thread-22} InterpreterSetting.java[setStatus]:740) - Set interpreter java status to READY
INFO [2020-04-01 14:52:09,288] ({Thread-23} InterpreterSetting.java[setStatus]:740) - Set interpreter jupyter status to READY
INFO [2020-04-01 14:52:09,288] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter file status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,301] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter groovy status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,301] ({Thread-24} InterpreterSetting.java[setStatus]:740) - Set interpreter file status to READY
INFO [2020-04-01 14:52:09,301] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter submarine status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,302] ({Thread-25} InterpreterSetting.java[setStatus]:740) - Set interpreter groovy status to READY
INFO [2020-04-01 14:52:09,302] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter spark status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,302] ({Thread-26} InterpreterSetting.java[setStatus]:740) - Set interpreter submarine status to READY
INFO [2020-04-01 14:52:09,303] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter sh status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,303] ({Thread-27} InterpreterSetting.java[setStatus]:740) - Set interpreter spark status to READY
INFO [2020-04-01 14:52:09,304] ({qtp701141022-13} InterpreterSetting.java[setStatus]:740) - Set interpreter md status to DOWNLOADING_DEPENDENCIES
INFO [2020-04-01 14:52:09,304] ({Thread-28} InterpreterSetting.java[setStatus]:740) - Set interpreter sh status to READY
INFO [2020-04-01 14:52:09,304] ({qtp701141022-13} LocalConfigStorage.java[save]:59) - Save Interpreter Setting to /opt/zeppelin-server/conf/interpreter.json
INFO [2020-04-01 14:52:09,304] ({Thread-29} InterpreterSetting.java[setStatus]:740) - Set interpreter md status to READY
INFO [2020-04-01 14:52:09,350] ({qtp701141022-13} LuceneSearch.java[<init>]:94) - Use /tmp/zeppelin-index for storing lucene search index
INFO [2020-04-01 14:52:09,467] ({qtp701141022-13} ShiroAuthenticationService.java[<init>]:70) - ShiroAuthenticationService is initialized
INFO [2020-04-01 14:52:09,628] ({qtp701141022-13} LoginRestApi.java[postLogin]:234) - {"status":"OK","message":"","body":{"principal":"mvince","ticket":"c27568c2-d493-4a94-933f-3b5351bc3210","roles":"[]"}}
INFO [2020-04-01 14:52:09,754] ({qtp701141022-15} StdSchedulerFactory.java[instantiate]:1184) - Using default implementation for ThreadExecutor
INFO [2020-04-01 14:52:09,757] ({qtp701141022-15} SimpleThreadPool.java[initialize]:268) - Job execution threads will use class loader of thread: qtp701141022-15
INFO [2020-04-01 14:52:09,770] ({qtp701141022-15} SchedulerSignalerImpl.java[<init>]:61) - Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
INFO [2020-04-01 14:52:09,771] ({qtp701141022-15} QuartzScheduler.java[<init>]:240) - Quartz Scheduler v.2.2.1 created.
INFO [2020-04-01 14:52:09,772] ({qtp701141022-15} RAMJobStore.java[initialize]:155) - RAMJobStore initialized.
INFO [2020-04-01 14:52:09,773] ({qtp701141022-15} QuartzScheduler.java[initialize]:305) - Scheduler meta-data: Quartz Scheduler (v2.2.1) 'DefaultQuartzScheduler' with instanceId 'NON_CLUSTERED'
  Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
  NOT STARTED.
  Currently in standby mode.
  Number of jobs executed: 0
  Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads.
  Using job-store 'org.quartz.simpl.RAMJobStore' - which does not support persistence. and is not clustered.
INFO [2020-04-01 14:52:09,774] ({qtp701141022-15} StdSchedulerFactory.java[instantiate]:1339) - Quartz scheduler 'DefaultQuartzScheduler' initialized from default resource file in Quartz package: 'quartz.properties'
INFO [2020-04-01 14:52:09,774] ({qtp701141022-15} StdSchedulerFactory.java[instantiate]:1343) - Quartz scheduler version: 2.2.1
INFO [2020-04-01 14:52:09,774] ({qtp701141022-15} QuartzScheduler.java[start]:575) - Scheduler DefaultQuartzScheduler_$_NON_CLUSTERED started.
INFO [2020-04-01 14:52:09,775] ({Init CronJob Thread} QuartzSchedulerService.java[lambda$new$1]:65) - Starting init cronjobs
INFO [2020-04-01 14:52:09,784] ({qtp701141022-15} Notebook.java[getNotesInfo]:557) - Start getNoteList
INFO [2020-04-01 14:52:09,787] ({qtp701141022-15} Notebook.java[getNotesInfo]:581) - Finish getNoteList
WARN [2020-04-01 14:52:09,815] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EFNUMC42 because its cron expression is empty.
WARN [2020-04-01 14:52:09,847] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EA6SHT18 because its cron expression is empty.
WARN [2020-04-01 14:52:09,860] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EGGA1EKB because its cron expression is empty.
WARN [2020-04-01 14:52:09,869] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EF24ZWPW because its cron expression is empty.
WARN [2020-04-01 14:52:09,874] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E8UCVQJR because its cron expression is empty.
WARN [2020-04-01 14:52:09,883] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJUT6WUF because its cron expression is empty.
WARN [2020-04-01 14:52:09,888] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E97WF2EQ because its cron expression is empty.
WARN [2020-04-01 14:52:09,893] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EP8ERVQK because its cron expression is empty.
WARN [2020-04-01 14:52:09,905] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ENZ4K8U8 because its cron expression is empty.
INFO [2020-04-01 14:52:09,920] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: loading_time_all_last, with cron expression: 0 16 10 * * ?
WARN [2020-04-01 14:52:09,931] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EZ9G3JJU because its cron expression is empty.
WARN [2020-04-01 14:52:09,932] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EB4N5R3F because its cron expression is empty.
WARN [2020-04-01 14:52:09,935] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EDQW7GMK because its cron expression is empty.
WARN [2020-04-01 14:52:09,950] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2BWJFTXKJ because its cron expression is empty.
WARN [2020-04-01 14:52:09,966] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2BWJFTXKM because its cron expression is empty.
INFO [2020-04-01 14:52:09,972] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: profile_extended_last, with cron expression: 0 41 9 * * ?
WARN [2020-04-01 14:52:09,975] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EPMFU6DA because its cron expression is empty.
WARN [2020-04-01 14:52:09,977] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EKF8N63T because its cron expression is empty.
WARN [2020-04-01 14:52:09,978] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EFDWHZB1 because its cron expression is empty.
WARN [2020-04-01 14:52:09,989] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EDST5HC1 because its cron expression is empty.
WARN [2020-04-01 14:52:09,998] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EM5CEYTJ because its cron expression is empty.
INFO [2020-04-01 14:52:09,999] ({qtp701141022-19} InterpreterSetting.java[getOrCreateInterpreterGroup]:466) - Create InterpreterGroup with groupId: spark-mvince for user: mvince and note: 2ET5BSCWB
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.SparkInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.SparkSqlInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.PySparkInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.IPySparkInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.SparkRInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.SparkIRInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.SparkShinyInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,002] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.spark.KotlinSparkInterpreter created for user: mvince, sessionId: shared_session
INFO [2020-04-01 14:52:10,003] ({qtp701141022-19} ManagedInterpreterGroup.java[getOrCreateSession]:174) - Create Session: shared_session in InterpreterGroup: spark-mvince for user: mvince
INFO [2020-04-01 14:52:10,003] ({qtp701141022-19} InterpreterSetting.java[getOrCreateInterpreterGroup]:466) - Create InterpreterGroup with groupId: livy-shared_process for user: mvince and note: 2ET5BSCWB
INFO [2020-04-01 14:52:10,003] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.livy.LivySparkInterpreter created for user: mvince, sessionId: mvince
INFO [2020-04-01 14:52:10,003] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.livy.LivySparkSQLInterpreter created for user: mvince, sessionId: mvince
INFO [2020-04-01 14:52:10,003] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.livy.LivyPySparkInterpreter created for user: mvince, sessionId: mvince
INFO [2020-04-01 14:52:10,004] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.livy.LivyPySpark3Interpreter created for user: mvince, sessionId: mvince
INFO [2020-04-01 14:52:10,004] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.livy.LivySparkRInterpreter created for user: mvince, sessionId: mvince
INFO [2020-04-01 14:52:10,004] ({qtp701141022-19} InterpreterSetting.java[createInterpreters]:821) - Interpreter org.apache.zeppelin.livy.LivySharedInterpreter created for user: mvince, sessionId: mvince
INFO [2020-04-01 14:52:10,004] ({qtp701141022-19} ManagedInterpreterGroup.java[getOrCreateSession]:174) - Create Session: mvince in InterpreterGroup: livy-shared_process for user: mvince
WARN [2020-04-01 14:52:10,007] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EA84Z46N because its cron expression is empty.
WARN [2020-04-01 14:52:10,012] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJVKCCHQ because its cron expression is empty.
WARN [2020-04-01 14:52:10,015] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EYT7Q6R8 because its cron expression is empty.
WARN [2020-04-01 14:52:10,020] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ECMC9HSM because its cron expression is empty.
WARN [2020-04-01 14:52:10,028] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EG4PG1WY because its cron expression is empty.
WARN [2020-04-01 14:52:10,036] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2F2AVWJ77 because its cron expression is empty.
WARN [2020-04-01 14:52:10,036] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EBEQAYV2 because its cron expression is empty.
WARN [2020-04-01 14:52:10,039] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2F1CHQ4TT because its cron expression is empty.
WARN [2020-04-01 14:52:10,041] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E8CQ9MS6 because its cron expression is empty.
WARN [2020-04-01 14:52:10,041] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ET5BSCWB because its cron expression is empty.
WARN [2020-04-01 14:52:10,046] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EN2D3N9X because its cron expression is empty.
WARN [2020-04-01 14:52:10,050] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EBJZ7B3G because its cron expression is empty.
WARN [2020-04-01 14:52:10,050] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EBQRAERC because its cron expression is empty.
WARN [2020-04-01 14:52:10,052] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EP8AG4BU because its cron expression is empty.
WARN [2020-04-01 14:52:10,055] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EF6X5CJF because its cron expression is empty.
INFO [2020-04-01 14:52:10,063] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: metrics_d0_last, with cron expression: 0 55 9 * * ?
WARN [2020-04-01 14:52:10,067] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJ23SKTS because its cron expression is empty.
WARN [2020-04-01 14:52:10,070] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EK8SFBGG because its cron expression is empty.
WARN [2020-04-01 14:52:10,072] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EE649P2Y because its cron expression is empty.
INFO [2020-04-01 14:52:10,076] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: , with cron expression: 0 15 05 ? * *
WARN [2020-04-01 14:52:10,081] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ER62Y5VJ because its cron expression is empty.
WARN [2020-04-01 14:52:10,084] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EVBRSEEH because its cron expression is empty.
WARN [2020-04-01 14:52:10,084] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EA62S7HB because its cron expression is empty.
WARN [2020-04-01 14:52:10,086] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E67RTQ67 because its cron expression is empty.
WARN [2020-04-01 14:52:10,088] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E61STQJ5 because its cron expression is empty.
WARN [2020-04-01 14:52:10,094] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EM2DXA7V because its cron expression is empty.
WARN [2020-04-01 14:52:10,100] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EGD129A5 because its cron expression is empty.
WARN [2020-04-01 14:52:10,104] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2F2YS7PCE because its cron expression is empty.
WARN [2020-04-01 14:52:10,110] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EAFDAS3E because its cron expression is empty.
WARN [2020-04-01 14:52:10,112] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E6V3FRRP because its cron expression is empty.
WARN [2020-04-01 14:52:10,114] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E9JUE7SP because its cron expression is empty.
WARN [2020-04-01 14:52:10,119] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EC93CJFY because its cron expression is empty.
INFO [2020-04-01 14:52:10,122] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: metrics_by_dig_last, with cron expression: 0 50 9 * * ?
WARN [2020-04-01 14:52:10,127] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EFQN2E5H because its cron expression is empty.
WARN [2020-04-01 14:52:10,130] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2C57UKYWR because its cron expression is empty.
WARN [2020-04-01 14:52:10,148] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ED5VNXNT because its cron expression is empty.
WARN [2020-04-01 14:52:10,149] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EFTZ976C because its cron expression is empty.
WARN [2020-04-01 14:52:10,152] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EHAVGUBC because its cron expression is empty.
WARN [2020-04-01 14:52:10,157] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EDZWEKC4 because its cron expression is empty.
WARN [2020-04-01 14:52:10,159] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJZYWBKU because its cron expression is empty.
WARN [2020-04-01 14:52:10,159] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EK8HTF88 because its cron expression is empty.
WARN [2020-04-01 14:52:10,193] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2F1S9ZY8Z because its cron expression is empty.
WARN [2020-04-01 14:52:10,194] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EASTANXY because its cron expression is empty.
WARN [2020-04-01 14:52:10,196] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EG2SXMMM because its cron expression is empty.
WARN [2020-04-01 14:52:10,200] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EK2KDS2Z because its cron expression is empty.
WARN [2020-04-01 14:52:10,202] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EF19QR3K because its cron expression is empty. WARN [2020-04-01 14:52:10,202] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EC2RVHSA because its cron expression is empty. WARN [2020-04-01 14:52:10,206] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EMJUV66E because its cron expression is empty. WARN [2020-04-01 14:52:10,208] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EBTQJJDQ because its cron expression is empty. WARN [2020-04-01 14:52:10,209] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EATUNUKB because its cron expression is empty. WARN [2020-04-01 14:52:10,213] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EYDJKFFY because its cron expression is empty. WARN [2020-04-01 14:52:10,219] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2C35YU814 because its cron expression is empty. WARN [2020-04-01 14:52:10,219] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EA55FQHT because its cron expression is empty. WARN [2020-04-01 14:52:10,223] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EQNNCYC9 because its cron expression is empty. WARN [2020-04-01 14:52:10,225] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EZ66TM57 because its cron expression is empty. WARN [2020-04-01 14:52:10,225] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E97C8D5T because its cron expression is empty. 
WARN [2020-04-01 14:52:10,228] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EPZWH987 because its cron expression is empty. WARN [2020-04-01 14:52:10,235] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EAUE9CX7 because its cron expression is empty. INFO [2020-04-01 14:52:10,239] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: loading_time_first_last, with cron expression: 0 10 10 * * ? WARN [2020-04-01 14:52:10,241] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E7XHUXJK because its cron expression is empty. WARN [2020-04-01 14:52:10,246] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2C2AUG798 because its cron expression is empty. WARN [2020-04-01 14:52:10,247] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EPKD684J because its cron expression is empty. WARN [2020-04-01 14:52:10,250] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EPANGK4J because its cron expression is empty. WARN [2020-04-01 14:52:10,252] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EW4HM8XQ because its cron expression is empty. WARN [2020-04-01 14:52:10,253] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJ798ABT because its cron expression is empty. WARN [2020-04-01 14:52:10,265] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EKDR3PT7 because its cron expression is empty. WARN [2020-04-01 14:52:10,270] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2BYEZ5EVK because its cron expression is empty. 
WARN [2020-04-01 14:52:10,276] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E72RM55E because its cron expression is empty. WARN [2020-04-01 14:52:10,283] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJNNR6DK because its cron expression is empty. WARN [2020-04-01 14:52:10,286] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EHMR1ADA because its cron expression is empty. WARN [2020-04-01 14:52:10,291] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EMCU8KAC because its cron expression is empty. WARN [2020-04-01 14:52:10,300] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EWM84JXA because its cron expression is empty. WARN [2020-04-01 14:52:10,304] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EN1E1ATY because its cron expression is empty. WARN [2020-04-01 14:52:10,309] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EPP67VKS because its cron expression is empty. WARN [2020-04-01 14:52:10,357] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EPATQQDT because its cron expression is empty. INFO [2020-04-01 14:52:10,361] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: fleet_overview_last, with cron expression: 0 35 9 * * ? WARN [2020-04-01 14:52:10,365] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EG3KSZ91 because its cron expression is empty. WARN [2020-04-01 14:52:10,371] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EX6S4QSD because its cron expression is empty. 
WARN [2020-04-01 14:52:10,373] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EU19WKPQ because its cron expression is empty. WARN [2020-04-01 14:52:10,375] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EAGE4GN7 because its cron expression is empty. WARN [2020-04-01 14:52:10,380] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ESRDQTJP because its cron expression is empty. WARN [2020-04-01 14:52:10,387] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EW19CSPA because its cron expression is empty. WARN [2020-04-01 14:52:10,392] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJE2WJK4 because its cron expression is empty. WARN [2020-04-01 14:52:10,395] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJJBKHPH because its cron expression is empty. WARN [2020-04-01 14:52:10,398] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EQ5G471X because its cron expression is empty. WARN [2020-04-01 14:52:10,401] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E859M6CV because its cron expression is empty. WARN [2020-04-01 14:52:10,403] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ENQDFSC3 because its cron expression is empty. WARN [2020-04-01 14:52:10,408] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EVH8FBP5 because its cron expression is empty. WARN [2020-04-01 14:52:10,410] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EM8RB72W because its cron expression is empty. 
WARN [2020-04-01 14:52:10,410] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EBPNK6HD because its cron expression is empty. WARN [2020-04-01 14:52:10,414] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EZFM3GJA because its cron expression is empty. WARN [2020-04-01 14:52:10,415] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E9KJ2421 because its cron expression is empty. WARN [2020-04-01 14:52:10,415] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EGNSJ24Y because its cron expression is empty. WARN [2020-04-01 14:52:10,419] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ENGYC9FS because its cron expression is empty. WARN [2020-04-01 14:52:10,420] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E9CQC4EN because its cron expression is empty. WARN [2020-04-01 14:52:10,431] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ETDNWNDD because its cron expression is empty. WARN [2020-04-01 14:52:10,433] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EKTB3H2B because its cron expression is empty. WARN [2020-04-01 14:52:10,434] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EFW2526S because its cron expression is empty. WARN [2020-04-01 14:52:10,437] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EQTYFAVQ because its cron expression is empty. WARN [2020-04-01 14:52:10,441] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ECQJEPE3 because its cron expression is empty. 
WARN [2020-04-01 14:52:10,445] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EKMN5G7V because its cron expression is empty. WARN [2020-04-01 14:52:10,447] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EFZR463V because its cron expression is empty. WARN [2020-04-01 14:52:10,449] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EJS1HGKH because its cron expression is empty. WARN [2020-04-01 14:52:10,449] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EBT34D6C because its cron expression is empty. INFO [2020-04-01 14:52:10,451] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: levelup_last, with cron expression: 0 00 10 * * ? WARN [2020-04-01 14:52:10,453] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EHX3MGHH because its cron expression is empty. WARN [2020-04-01 14:52:10,456] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E5P699H6 because its cron expression is empty. WARN [2020-04-01 14:52:10,457] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2E7WFV4CG because its cron expression is empty. WARN [2020-04-01 14:52:10,460] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EQMM2A9W because its cron expression is empty. WARN [2020-04-01 14:52:10,465] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ENDE3NH9 because its cron expression is empty. WARN [2020-04-01 14:52:10,466] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EYD56B9B because its cron expression is empty. 
INFO [2020-04-01 14:52:10,469] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:167) - Trigger cron for note: tutorial_last, with cron expression: 0 20 10 * * ? WARN [2020-04-01 14:52:10,470] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ECHQRNZA because its cron expression is empty. WARN [2020-04-01 14:52:10,472] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EGT4EH9G because its cron expression is empty. WARN [2020-04-01 14:52:10,473] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2ERA45BAY because its cron expression is empty. WARN [2020-04-01 14:52:10,476] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2A94M5J1Z because its cron expression is empty. WARN [2020-04-01 14:52:10,481] ({Init CronJob Thread} QuartzSchedulerService.java[refreshCron]:132) - Skip refresh cron of note 2EYUV26VR because its cron expression is empty. 
INFO [2020-04-01 14:52:10,481] ({Init CronJob Thread} QuartzSchedulerService.java[lambda$new$1]:81) - Complete init cronjobs
WARN [2020-04-01 14:52:11,107] ({qtp701141022-19} NotebookServer.java[onFailure]:2041) - HTTP 404 Not Found
WARN [2020-04-01 14:52:11,154] ({qtp701141022-18} NotebookServer.java[onFailure]:2041) - HTTP 404 Not Found
INFO [2020-04-01 14:52:26,883] ({qtp701141022-20} Helium.java[loadConf]:140) - Add helium local registry /opt/zeppelin-server/helium
WARN [2020-04-01 14:52:26,884] ({qtp701141022-20} Helium.java[loadConf]:148) - /opt/zeppelin-server/conf/helium.json does not exists
INFO [2020-04-01 14:52:29,117] ({qtp701141022-19} NotebookService.java[runParagraph]:293) - Start to run paragraph: paragraph_1571833609109_-338708939 of note: 2ET5BSCWB
INFO [2020-04-01 14:52:29,118] ({qtp701141022-19} VFSNotebookRepo.java[save]:145) - Saving note 2ET5BSCWB to mvince/spark_test_2ET5BSCWB.zpln
INFO [2020-04-01 14:52:29,139] ({qtp701141022-19} SchedulerFactory.java[<init>]:62) - Scheduler Thread Pool Size: 100
INFO [2020-04-01 14:52:29,156] ({SchedulerFactory2} AbstractScheduler.java[runJob]:125) - Job paragraph_1571833609109_-338708939 started by scheduler RemoteInterpreter-spark-mvince-shared_session
INFO [2020-04-01 14:52:29,157] ({SchedulerFactory2} Paragraph.java[jobRun]:407) - Run paragraph [paragraph_id: paragraph_1571833609109_-338708939, interpreter: org.apache.zeppelin.spark.SparkInterpreter, note_id: 2ET5BSCWB, user: mvince]
INFO [2020-04-01 14:52:29,157] ({SchedulerFactory2} ManagedInterpreterGroup.java[getOrCreateInterpreterProcess]:62) - Create InterpreterProcess for InterpreterGroup: spark-mvince
INFO [2020-04-01 14:52:29,159] ({SchedulerFactory2} PluginManager.java[loadInterpreterLauncher]:141) - Loading Interpreter Launcher Plugin: SparkInterpreterLauncher
INFO [2020-04-01 14:52:29,163] ({SchedulerFactory2} StandardInterpreterLauncher.java[launch]:49) - Launching Interpreter: spark
INFO [2020-04-01 14:52:29,944] ({SchedulerFactory2} SparkInterpreterLauncher.java[buildEnvFromProperties]:182) - Run Spark under secure mode with keytab: /etc/security/keytabs/zeppelin.server.kerberos.keytab, principal: zeppelin-dwh_prod_ht@mynetwork.LOCAL
INFO [2020-04-01 14:52:29,959] ({SchedulerFactory2} ProcessLauncher.java[transition]:109) - Process state is transitioned to LAUNCHED
INFO [2020-04-01 14:52:29,959] ({SchedulerFactory2} ProcessLauncher.java[launch]:96) - Process is launched: [/opt/zeppelin-server/bin/interpreter.sh, -d, /opt/zeppelin-server/interpreter/spark, -c, 192.168.200.65, -p, 38198, -r, :, -i, spark-mvince, -u, mvince, -l, /opt/zeppelin-server/local-repo/spark, -g, spark]
INFO [2020-04-01 14:52:29,999] ({Exec Stream Pumper} ProcessLauncher.java[processLine]:188) - Interpreter launch command: /usr/hdp/2.6.5.0-292/spark2/bin/spark-submit --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer --driver-class-path ":/opt/zeppelin-server/interpreter/spark/*::/opt/zeppelin-server/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/opt/zeppelin-server/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar:/usr/hdp/2.6.5.0-292/spark2/jars/*:/etc/hadoop/conf" --driver-java-options " -Dfile.encoding=UTF-8 -Dlog4j.configuration=log4j_yarn_cluster.properties -Dzeppelin.log.file='/opt/zeppelin-server/logs/zeppelin-interpreter-spark-mvince-mvince-zeppelin-dwh-prod-ht-edge-05.mynetwork-dc.local.log'" --master yarn --conf spark\.yarn\.dist\.archives\=\/usr\/hdp\/2\.6\.5\.0-292\/spark2\/R\/lib\/sparkr\.zip\#sparkr --conf spark\.yarn\.isPython\=true --conf spark\.app\.name\=test_drive --conf spark\.webui\.yarn\.useProxy\=false --conf spark\.driver\.cores\=1 --conf spark\.yarn\.maxAppAttempts\=1 --conf spark\.executor\.memory\=1g --conf spark\.files\=\/opt\/zeppelin-server\/conf\/log4j_yarn_cluster\.properties --conf spark\.driver\.memory\=1g --conf spark\.jars\=\/opt\/zeppelin-server\/interpreter\/spark\/scala-2\.11\/spark-scala-2\.11-0\.9\.0-SNAPSHOT\.jar\,\/opt\/zeppelin-server\/interpreter\/zeppelin-interpreter-shaded-0\.9\.0-SNAPSHOT\.jar --conf spark\.submit\.deployMode\=cluster --conf spark\.executor\.cores\=1 --conf spark\.yarn\.submit\.waitAppCompletion\=false --proxy-user mvince /opt/zeppelin-server/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar 192.168.200.65 38198 "spark-mvince" :
WARN [2020-04-01 14:52:35,887] ({Exec Default Executor} RemoteInterpreterManagedProcess.java[onProcessComplete]:263) - Process is exited with exit value 0
INFO [2020-04-01 14:52:35,887] ({Exec Default Executor} ProcessLauncher.java[transition]:109) - Process state is transitioned to COMPLETED
INFO [2020-04-01 14:52:41,903] ({pool-6-thread-1} ProcessLauncher.java[transition]:109) - Process state is transitioned to RUNNING
INFO [2020-04-01 14:52:41,960] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.SparkInterpreter
INFO [2020-04-01 14:52:42,215] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.SparkSqlInterpreter
INFO [2020-04-01 14:52:42,218] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.PySparkInterpreter
INFO [2020-04-01 14:52:42,224] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.IPySparkInterpreter
INFO [2020-04-01 14:52:42,232] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.SparkRInterpreter
INFO [2020-04-01 14:52:42,237] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.SparkIRInterpreter
INFO [2020-04-01 14:52:42,240] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.SparkShinyInterpreter
INFO
[2020-04-01 14:52:42,244] ({SchedulerFactory2} RemoteInterpreter.java[call]:171) - Create RemoteInterpreter org.apache.zeppelin.spark.KotlinSparkInterpreter
INFO [2020-04-01 14:52:42,347] ({SchedulerFactory2} RemoteInterpreter.java[call]:141) - Open RemoteInterpreter org.apache.zeppelin.spark.SparkInterpreter
INFO [2020-04-01 14:52:42,347] ({SchedulerFactory2} RemoteInterpreter.java[pushAngularObjectRegistryToRemote]:431) - Push local angular object registry from ZeppelinServer to remote interpreter group spark-mvince
WARN [2020-04-01 14:52:44,434] ({SchedulerFactory2} NotebookServer.java[onStatusChange]:1901) - Job paragraph_1571833609109_-338708939 is finished, status: ERROR, exception: null, result: %text
org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:577)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
    at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:114)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    ... 8 more
Caused by: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
    at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
    at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
    at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:432)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:855)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:813)
    at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:675)
    at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:712)
    at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
    at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at scala.tools.nsc.interpreter.IMain.quietBind(IMain.scala:711)
    at org.apache.zeppelin.spark.SparkScala211Interpreter$.loopPostInit$1(SparkScala211Interpreter.scala:164)
    at org.apache.zeppelin.spark.SparkScala211Interpreter$.org$apache$zeppelin$spark$SparkScala211Interpreter$$loopPostInit(SparkScala211Interpreter.scala:199)
    at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:95)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:98)
    ... 9 more
INFO [2020-04-01 14:52:44,435] ({SchedulerFactory2} VFSNotebookRepo.java[save]:145) - Saving note 2ET5BSCWB to mvince/spark_test_2ET5BSCWB.zpln
INFO [2020-04-01 14:52:44,438] ({SchedulerFactory2} AbstractScheduler.java[runJob]:152) - Job paragraph_1571833609109_-338708939 finished by scheduler RemoteInterpreter-spark-mvince-shared_session
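For what it's worth, `MissingRequirementError: object java.lang.Object in compiler mirror not found` generally means the Scala compiler embedded in the driver cannot see the JDK's core classes on its boot classpath. One thing worth ruling out is a malformed JAVA_HOME: in the interpreter settings above it is set to `/usr/lib/jvm/java-8-oracle/jre/bin/java`, i.e. the java binary itself rather than a JDK/JRE directory. A minimal sketch of that check, assuming bash on the driver host (`check_java_home` is a hypothetical helper, not part of Zeppelin or Spark):

```shell
# check_java_home: verifies that a candidate JAVA_HOME value is a directory
# containing bin/java, rather than a path to the java executable itself
# (a common misconfiguration that only bites in yarn-cluster mode, where
# the driver JVM is started from the container's environment).
check_java_home() {
  local candidate="$1"
  if [ -d "$candidate" ] && [ -x "$candidate/bin/java" ]; then
    echo "ok"
  else
    echo "bad"
  fi
}

check_java_home "/usr/lib/jvm/java-8-oracle/jre/bin/java"  # value from the settings (names the binary)
check_java_home "/usr/lib/jvm/java-8-oracle"               # shape a JDK home normally has
```

In client mode the interpreter inherits the Zeppelin server's own environment, which could explain why the same settings work there but not in cluster mode.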
lrwxrwxrwx 1 mvince hadoop 88 Apr 1 14:52 __app__.jar -> /data/00/yarn/local/usercache/mvince/filecache/1883/spark-interpreter-0.9.0-SNAPSHOT.jar
-rw------- 1 mvince hadoop 364 Apr 1 14:52 container_tokens
-rwx------ 1 mvince hadoop 8177 Apr 1 14:52 launch_container.sh
lrwxrwxrwx 1 mvince hadoop 81 Apr 1 14:52 log4j_yarn_cluster.properties -> /data/00/yarn/local/usercache/mvince/filecache/1878/log4j_yarn_cluster.properties
lrwxrwxrwx 1 mvince hadoop 71 Apr 1 14:52 py4j-0.10.6-src.zip -> /data/02/yarn/local/usercache/mvince/filecache/1880/py4j-0.10.6-src.zip
lrwxrwxrwx 1 mvince hadoop 63 Apr 1 14:52 pyspark.zip -> /data/01/yarn/local/usercache/mvince/filecache/1879/pyspark.zip
lrwxrwxrwx 1 mvince hadoop 70 Apr 1 14:52 __spark_conf__ -> /data/04/yarn/local/usercache/mvince/filecache/1882/__spark_conf__.zip
drwxr-s--- 2 mvince hadoop 4096 Apr 1 14:52 __spark_libs__
lrwxrwxrwx 1 mvince hadoop 62 Apr 1 14:52 sparkr -> /data/04/yarn/local/usercache/mvince/filecache/1877/sparkr.zip
lrwxrwxrwx 1 mvince hadoop 87 Apr 1 14:52 spark-scala-2.11-0.9.0-SNAPSHOT.jar -> /data/03/yarn/local/usercache/mvince/filecache/1881/spark-scala-2.11-0.9.0-SNAPSHOT.jar
drwxr-s--- 2 mvince hadoop 4096 Apr 1 14:52 tmp
lrwxrwxrwx 1 mvince hadoop 98 Apr 1 14:52 zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar -> /data/01/yarn/local/usercache/mvince/filecache/1884/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar

find -L . -maxdepth 5 -ls:
61343077 4 drwxr-s--- 4 mvince hadoop 4096 Apr 1 14:52 .
61343078 8 -rwx------ 1 mvince hadoop 8177 Apr 1 14:52 ./launch_container.sh
116262286 4 drwx------ 3 mvince mvince 4096 Apr 1 14:52 ./sparkr
[... ~35 entries under ./sparkr/SparkR (R package contents: profile, tests, help, html, Meta, worker, R) trimmed ...]
61212086 532 -r-x------ 1 mvince mvince 541536 Apr 1 14:52 ./pyspark.zip
11404735 76 -r-x------ 1 mvince mvince 75035 Apr 1 14:52 ./spark-scala-2.11-0.9.0-SNAPSHOT.jar
118100066 4 -r-x------ 1 mvince mvince 1019 Apr 1 14:52 ./log4j_yarn_cluster.properties
118362129 61100 -r-x------ 1 mvince mvince 62565313 Apr 1 14:52 ./__app__.jar
18351295 80 -r-x------ 1 mvince mvince 80352 Apr 1 14:52 ./py4j-0.10.6-src.zip
61736356 4 drwxr-s--- 2 mvince hadoop 4096 Apr 1 14:52 ./tmp
61867326 4 drwxr-s--- 2 mvince hadoop 4096 Apr 1 14:52 ./__spark_libs__
7340053 168 -r-xr-xr-x 1 yarn hadoop 168693 Jan 11 2019 ./__spark_libs__/logstash-gelf-1.12.0.jar
110887458 4 -r-xr-xr-x 1 yarn hadoop 2 Jun 29 2018 ./__spark_libs__/dummy.jar
117835091 4 drwx------ 3 mvince mvince 4096 Apr 1 14:52 ./__spark_conf__
[... ~30 entries under ./__spark_conf__/__hadoop_conf__ (stock Hadoop client config files) trimmed ...]
117835094 132 -r-x------ 1 mvince mvince 134635 Apr 1 14:52 ./__spark_conf__/__spark_hadoop_conf__.xml
117835093 8 -r-x------ 1 mvince mvince 4956 Apr 1 14:52 ./__spark_conf__/metrics.properties
117835099 8 -r-x------ 1 mvince mvince 4258 Apr 1 14:52 ./__spark_conf__/__spark_conf__.properties
117835092 4 -r-x------ 1 mvince mvince 1139 Apr 1 14:52 ./__spark_conf__/log4j.properties
61343079 4 -rw------- 1 mvince hadoop 364 Apr 1 14:52 ./container_tokens
61474199 22868 -r-x------ 1 mvince mvince 23412838 Apr 1 14:52 ./zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):
INFO [2020-04-01 14:52:39,894] ({main} Logging.scala[logInfo]:54) - Registered signal handler for TERM
INFO [2020-04-01 14:52:39,897] ({main} Logging.scala[logInfo]:54) - Registered signal handler for HUP
INFO [2020-04-01 14:52:39,899] ({main} Logging.scala[logInfo]:54) - Registered signal handler for INT
INFO [2020-04-01 14:52:40,031] ({main} Logging.scala[logInfo]:54) - Changing view acls to: mvince
INFO [2020-04-01 14:52:40,031] ({main} Logging.scala[logInfo]:54) - Changing modify acls to: mvince
INFO [2020-04-01 14:52:40,031] ({main} Logging.scala[logInfo]:54) - Changing view acls groups to:
INFO [2020-04-01 14:52:40,032] ({main} Logging.scala[logInfo]:54) - Changing modify acls groups to:
INFO [2020-04-01 14:52:40,032] ({main} Logging.scala[logInfo]:54) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mvince); groups with view permissions: Set(); users with modify permissions: Set(mvince); groups with modify permissions: Set()
DEBUG [2020-04-01 14:52:40,043] ({main} Logging.scala[logDebug]:58) - Created SSL options for fs: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
DEBUG [2020-04-01 14:52:40,130] ({main} Shell.java[isSetsidSupported]:778) - setsid exited with exit code 0
DEBUG [2020-04-01 14:52:40,342] ({main} MutableMetricsFactory.java[newForField]:42) - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
DEBUG [2020-04-01 14:52:40,352] ({main} MutableMetricsFactory.java[newForField]:42) - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
DEBUG [2020-04-01 14:52:40,352] ({main} MutableMetricsFactory.java[newForField]:42) - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
DEBUG [2020-04-01 14:52:40,353] ({main} MutableMetricsFactory.java[newForField]:42) - field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Renewal failures since startup])
DEBUG [2020-04-01 14:52:40,353] ({main} MutableMetricsFactory.java[newForField]:42) - field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Renewal failures since last successful login])
DEBUG [2020-04-01 14:52:40,354] ({main} MetricsSystemImpl.java[register]:232) - UgiMetrics, User and group related metrics
DEBUG [2020-04-01 14:52:40,393] ({main} SecurityUtil.java[setTokenServiceUseIp]:116) - Setting hadoop.security.token.service.use_ip to true
DEBUG [2020-04-01 14:52:40,423] ({main} Groups.java[getUserToGroupsMappingService]:418) - Creating new Groups object
DEBUG [2020-04-01 14:52:40,424] ({main} NativeCodeLoader.java[<clinit>]:46) - Trying to load the custom-built native-hadoop library...
DEBUG [2020-04-01 14:52:40,425] ({main} NativeCodeLoader.java[<clinit>]:50) - Loaded the native-hadoop library
DEBUG [2020-04-01 14:52:40,425] ({main} JniBasedUnixGroupsMapping.java[<clinit>]:50) - Using JniBasedUnixGroupsMapping for Group resolution
DEBUG [2020-04-01 14:52:40,425] ({main} JniBasedUnixGroupsMappingWithFallback.java[<init>]:45) - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
DEBUG [2020-04-01 14:52:40,490] ({main} Groups.java[<init>]:145) - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
DEBUG [2020-04-01 14:52:40,495] ({main} UserGroupInformation.java[login]:247) - hadoop login
DEBUG [2020-04-01 14:52:40,496] ({main} UserGroupInformation.java[commit]:182) - hadoop login commit
DEBUG [2020-04-01 14:52:40,497] ({main} UserGroupInformation.java[commit]:196) - using kerberos user:null
DEBUG [2020-04-01 14:52:40,498] ({main} UserGroupInformation.java[commit]:212) - using local user:UnixPrincipal: mvince
DEBUG [2020-04-01 14:52:40,498] ({main} UserGroupInformation.java[commit]:218) - Using user: "UnixPrincipal: mvince" with name mvince
DEBUG [2020-04-01 14:52:40,499] ({main} UserGroupInformation.java[commit]:228) - User entry: "mvince"
DEBUG [2020-04-01 14:52:40,499] ({main} UserGroupInformation.java[loginUserFromSubject]:855) - Assuming keytab is managed externally since logged in from subject.
DEBUG [2020-04-01 14:52:40,501] ({main} UserGroupInformation.java[loginUserFromSubject]:894) - Reading credentials from location set in HADOOP_TOKEN_FILE_LOCATION: /data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/container_tokens
DEBUG [2020-04-01 14:52:40,505] ({main} UserGroupInformation.java[loginUserFromSubject]:905) - Loaded 3 tokens
DEBUG [2020-04-01 14:52:40,506] ({main} UserGroupInformation.java[loginUserFromSubject]:914) - UGI loginUser:mvince (auth:KERBEROS)
DEBUG [2020-04-01 14:52:40,577] ({main} Logging.scala[logDebug]:58) - creating UGI for user: mvince
DEBUG [2020-04-01 14:52:40,578] ({main} UserGroupInformation.java[logPrivilegedAction]:1896) - PrivilegedAction as:mvince (auth:SIMPLE) from:org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:814)
DEBUG [2020-04-01 14:52:40,622] ({main} UserGroupInformation.java[logPrivilegedAction]:1896) - PrivilegedAction as:mvince (auth:SIMPLE) from:org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:814)
INFO [2020-04-01 14:52:40,622] ({main} Logging.scala[logInfo]:54) - Preparing Local resources
DEBUG [2020-04-01 14:52:40,899] ({main} GoogleHadoopFileSystemBase.java[<clinit>]:642) - GHFS version: 1.8.1.2.6.5.0-292
DEBUG [2020-04-01 14:52:40,985] ({main} DFSClient.java[<init>]:474) - dfs.client.use.legacy.blockreader.local = false
DEBUG [2020-04-01 14:52:40,985] ({main} DFSClient.java[<init>]:477) - dfs.client.read.shortcircuit = true
DEBUG [2020-04-01 14:52:40,985] ({main} DFSClient.java[<init>]:480) - dfs.client.domain.socket.data.traffic = false
DEBUG [2020-04-01 14:52:40,985] ({main} DFSClient.java[<init>]:483) - dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
DEBUG [2020-04-01 14:52:41,054] ({main} SecurityUtil.java[setTokenService]:421) - Acquired token Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.200.21:8020, Ident: (HDFS_DELEGATION_TOKEN token 3788277 for mvince)
DEBUG [2020-04-01 14:52:41,055] ({main} HAUtil.java[cloneDelegationTokenForLogicalUri]:323) - Mapped HA service delegation token for logical URI hdfs://dwhprodht/user/mvince/.sparkStaging/application_1573680659125_670094/__spark_conf__.zip to namenode dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020
DEBUG [2020-04-01 14:52:41,055] ({main} SecurityUtil.java[setTokenService]:421) - Acquired token Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.200.22:8020, Ident: (HDFS_DELEGATION_TOKEN token 3788277 for mvince)
DEBUG [2020-04-01 14:52:41,055] ({main} HAUtil.java[cloneDelegationTokenForLogicalUri]:323) - Mapped HA service delegation token for logical URI hdfs://dwhprodht/user/mvince/.sparkStaging/application_1573680659125_670094/__spark_conf__.zip to namenode dwh-prod-ht-master-02.mynetwork-dc.local/192.168.200.22:8020
DEBUG [2020-04-01 14:52:41,056] ({main} DFSClient.java[<init>]:474) - dfs.client.use.legacy.blockreader.local = false
DEBUG [2020-04-01 14:52:41,056] ({main} DFSClient.java[<init>]:477) - dfs.client.read.shortcircuit = true
DEBUG [2020-04-01 14:52:41,056] ({main} DFSClient.java[<init>]:480) - dfs.client.domain.socket.data.traffic = false
DEBUG [2020-04-01 14:52:41,056] ({main} DFSClient.java[<init>]:483) - dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
DEBUG [2020-04-01 14:52:41,067] ({main} RetryUtils.java[getDefaultRetryPolicy]:75) - multipleLinearRandomRetry = null
DEBUG [2020-04-01 14:52:41,142] ({Finalizer} NativeAzureFileSystem.java[finalize]:3880) - finalize() called.
DEBUG [2020-04-01 14:52:41,149] ({Finalizer} NativeAzureFileSystem.java[finalize]:3880) - finalize() called.
DEBUG [2020-04-01 14:52:41,211] ({main} Server.java[registerProtocolEngine]:275) - rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@191ae03f
DEBUG [2020-04-01 14:52:41,214] ({main} ClientCache.java[getClient]:63) - getting client out of cache: org.apache.hadoop.ipc.Client@35fe2125
DEBUG [2020-04-01 14:52:41,417] ({client DomainSocketWatcher} DomainSocketWatcher.java[run]:447) - org.apache.hadoop.net.unix.DomainSocketWatcher$2@2f0cb6f6: starting with interruptCheckPeriodMs = 60000
DEBUG [2020-04-01 14:52:41,421] ({main} DomainSocketFactory.java[<init>]:120) - The short-circuit local reads feature is enabled.
DEBUG [2020-04-01 14:52:41,425] ({main} DataTransferSaslUtil.java[getSaslPropertiesResolver]:183) - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
DEBUG [2020-04-01 14:52:41,452] ({main} Client.java[<init>]:465) - The ping interval is 60000 ms.
DEBUG [2020-04-01 14:52:41,454] ({main} Client.java[setupIOstreams]:737) - Connecting to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020
DEBUG [2020-04-01 14:52:41,465] ({main} UserGroupInformation.java[logPrivilegedAction]:1896) - PrivilegedAction as:mvince (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:758)
DEBUG [2020-04-01 14:52:41,524] ({main} SaslRpcClient.java[sendSaslMessage]:457) - Sending sasl message state: NEGOTIATE
DEBUG [2020-04-01 14:52:41,530] ({main} SaslRpcClient.java[saslConnect]:389) - Received SASL message state: NEGOTIATE auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" challenge: "realm=\"default\",nonce=\"kjhasdjknasjn+jkhdsjhdsjnjksdnjjksd\",qop=\"auth\",charset=utf-8,algorithm=md5-sess" } auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "dwh-prod-ht-master-01.mynetwork-dc.local" }
DEBUG [2020-04-01 14:52:41,531] ({main} SaslRpcClient.java[getServerToken]:264) - Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
DEBUG [2020-04-01 14:52:41,536] ({main} SaslRpcClient.java[createSaslClient]:247) - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
DEBUG [2020-04-01 14:52:41,537] ({main} SaslRpcClient.java[selectSaslClient]:176) - Use TOKEN authentication for protocol ClientNamenodeProtocolPB
DEBUG [2020-04-01 14:52:41,538] ({main} SaslRpcClient.java[handle]:683) - SASL client callback: setting username: jdisajisadjosaosdjaijdsjasdoijsaiojsadijdsaijsdai==
DEBUG [2020-04-01 14:52:41,538] ({main} SaslRpcClient.java[handle]:688) - SASL client callback: setting userPassword
DEBUG [2020-04-01 14:52:41,539] ({main} SaslRpcClient.java[handle]:693) - SASL client callback: setting realm: default
DEBUG [2020-04-01 14:52:41,540] ({main} SaslRpcClient.java[sendSaslMessage]:457) - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"jdisajisadjosaosdjaijdsjasdoijsaiojsadijdsaijsdai==\",realm=\"default\",nonce=\"6humwQry+j9f3j28chdhsucjsx\",nc=00000001,cnonce=\"izPh9q+w/eyxfwxy4fKgWDcYJQcjISX80YHcSTEL\",digest-uri=\"/default\",maxbuf=65536,response=e544dcf1bbac8207d73e3e6727486369,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
DEBUG [2020-04-01 14:52:41,541] ({main} SaslRpcClient.java[saslConnect]:389) - Received SASL message state: SUCCESS token: "rspauth=6767897987898ashash89sx9sasa98sa8"
DEBUG [2020-04-01 14:52:41,541] ({main} Client.java[setupIOstreams]:781) - Negotiated QOP is :auth
DEBUG [2020-04-01 14:52:41,546] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[run]:1009) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince: starting, having connections 1
DEBUG [2020-04-01 14:52:41,547] ({IPC Parameter Sending Thread #0} Client.java[run]:1072) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince sending #0
DEBUG [2020-04-01 14:52:41,548] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[receiveRpcResponse]:1129) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince got value #0
DEBUG [2020-04-01 14:52:41,548] ({main} ProtobufRpcEngine.java[invoke]:254) - Call: getFileInfo took 116ms
DEBUG [2020-04-01 14:52:41,601] ({main} UserGroupInformation.java[logPrivilegedAction]:1896) - PrivilegedAction as:mvince (auth:SIMPLE) from:org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:814)
INFO [2020-04-01 14:52:41,656] ({main} Logging.scala[logInfo]:54) - ApplicationAttemptId: appattempt_1573680659125_670094_000001
DEBUG [2020-04-01 14:52:41,657] ({main} Logging.scala[logDebug]:58) - Adding shutdown hook
INFO [2020-04-01 14:52:41,667] ({main} Logging.scala[logInfo]:54) - Starting the user application in a separate Thread
INFO [2020-04-01 14:52:41,698] ({main} Logging.scala[logInfo]:54) - Waiting for spark context initialization...
WARN [2020-04-01 14:52:41,783] ({Driver} ZeppelinConfiguration.java[create]:159) - Failed to load configuration, proceeding with a default
INFO [2020-04-01 14:52:41,824] ({Driver} ZeppelinConfiguration.java[create]:171) - Server Host: 127.0.0.1
INFO [2020-04-01 14:52:41,824] ({Driver} ZeppelinConfiguration.java[create]:173) - Server Port: 8080
INFO [2020-04-01 14:52:41,824] ({Driver} ZeppelinConfiguration.java[create]:177) - Context Path: /
INFO [2020-04-01 14:52:41,824] ({Driver} ZeppelinConfiguration.java[create]:178) - Zeppelin Version: 0.9.0-SNAPSHOT
INFO [2020-04-01 14:52:41,825] ({Driver} RemoteInterpreterServer.java[<init>]:161) - Starting remote interpreter server on port 0, intpEventServerAddress: 192.168.200.65:38198
INFO [2020-04-01 14:52:41,854] ({Driver} RemoteInterpreterServer.java[<init>]:188) - Launching ThriftServer at 192.168.200.32:45956
INFO [2020-04-01 14:52:42,206] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.SparkInterpreter
DEBUG [2020-04-01 14:52:42,206] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.SparkInterpreter to session shared_session
INFO [2020-04-01 14:52:42,212] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.SparkSqlInterpreter
DEBUG [2020-04-01 14:52:42,212] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.SparkSqlInterpreter to session shared_session
INFO [2020-04-01 14:52:42,218] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.PySparkInterpreter
DEBUG [2020-04-01 14:52:42,219] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.PySparkInterpreter to session shared_session
INFO [2020-04-01 14:52:42,226] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.IPySparkInterpreter
DEBUG [2020-04-01 14:52:42,226] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.IPySparkInterpreter to session shared_session
INFO [2020-04-01 14:52:42,231] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.SparkRInterpreter
DEBUG [2020-04-01 14:52:42,231] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.SparkRInterpreter to session shared_session
INFO [2020-04-01 14:52:42,234] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.SparkIRInterpreter
DEBUG [2020-04-01 14:52:42,234] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.SparkIRInterpreter to session shared_session
INFO [2020-04-01 14:52:42,238] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.SparkShinyInterpreter
DEBUG [2020-04-01 14:52:42,238] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.SparkShinyInterpreter to session shared_session
DEBUG [2020-04-01 14:52:42,243] ({pool-5-thread-1} KotlinSparkInterpreter.java[<init>]:62) - Creating KotlinSparkInterpreter
INFO [2020-04-01 14:52:42,340] ({pool-5-thread-1} RemoteInterpreterServer.java[createInterpreter]:408) - Instantiate interpreter org.apache.zeppelin.spark.KotlinSparkInterpreter
DEBUG [2020-04-01 14:52:42,340] ({pool-5-thread-1} InterpreterGroup.java[addInterpreterToSession]:91) - Add Interpreter org.apache.zeppelin.spark.KotlinSparkInterpreter to session shared_session
INFO [2020-04-01 14:52:42,395] ({pool-5-thread-2} SchedulerFactory.java[<init>]:62) - Scheduler Thread Pool Size: 100
DEBUG [2020-04-01 14:52:42,400] ({pool-5-thread-2} Interpreter.java[getProperty]:204) - key: zeppelin.spark.concurrentSQL, value: false
DEBUG [2020-04-01 14:52:42,429] ({pool-5-thread-2} RemoteInterpreterServer.java[interpret]:523) - st: val x = "hello world teraz sem pisem zmeny velke, sem pisem sovje zmeny, tuktk tuk tu," print(x)
INFO [2020-04-01 14:52:42,484] ({FIFOScheduler-interpreter_1074544391-Worker-1} AbstractScheduler.java[runJob]:125) - Job paragraph_1571833609109_-338708939 started by scheduler interpreter_1074544391
INFO [2020-04-01 14:52:42,498] ({FIFOScheduler-interpreter_1074544391-Worker-1} SparkInterpreter.java[extractScalaVersion]:233) - Using Scala: version 2.11.8
DEBUG [2020-04-01 14:52:42,517] ({FIFOScheduler-interpreter_1074544391-Worker-1} Interpreter.java[getProperty]:204) - key: zeppelin.interpreter.localRepo, value: /opt/zeppelin-server/local-repo/spark
INFO [2020-04-01 14:52:42,524] ({FIFOScheduler-interpreter_1074544391-Worker-1} SparkScala211Interpreter.scala[open]:62) - Scala shell repl output dir: /data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/tmp/spark8391684309472116098
DEBUG [2020-04-01 14:52:42,763] ({FIFOScheduler-interpreter_1074544391-Worker-1} BaseSparkScalaInterpreter.scala[getUserJars]:463) - User jar for spark repl: file:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/__app__.jar,file:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/spark-scala-2.11-0.9.0-SNAPSHOT.jar,file:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar,/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/__app__.jar,/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/spark-scala-2.11-0.9.0-SNAPSHOT.jar,/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar
INFO [2020-04-01 14:52:42,764] ({FIFOScheduler-interpreter_1074544391-Worker-1} SparkScala211Interpreter.scala[open]:77) - UserJars: file:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/__app__.jar:file:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/spark-scala-2.11-0.9.0-SNAPSHOT.jar:file:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/__app__.jar:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/spark-scala-2.11-0.9.0-SNAPSHOT.jar:/data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar
[init] error: error while loading <root>, Error accessing /data/01/yarn/local/usercache/mvince/appcache/application_1573680659125_670094/container_e150_1573680659125_670094_01_000001/__spark_libs__/dummy.jar
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
ERROR [2020-04-01 14:52:44,394] ({FIFOScheduler-interpreter_1074544391-Worker-1} SparkInterpreter.java[open]:113) - Fail to open SparkInterpreter
scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
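Side note on the `[init] error ... Error accessing .../__spark_libs__/dummy.jar` line above: the container listing earlier shows `./__spark_libs__/dummy.jar` is only 2 bytes, so it cannot be a parseable zip/jar, and the Scala 2.11 REPL aborts compiler initialization as soon as any classpath entry is unreadable (which is why it then cannot even find `java.lang.Object` and prints the `-usejavacp` hint). A quick sanity check one could run on a node; the 2-byte file is recreated here for illustration, its real contents are unknown:

```shell
# Recreate a 2-byte placeholder like ./__spark_libs__/dummy.jar from the
# container listing ('xx' is an assumed payload; only the size matters).
tmp=$(mktemp -d)
printf 'xx' > "$tmp/dummy.jar"

# A readable jar/zip must start with the "PK" magic bytes and be at least
# 22 bytes (the end-of-central-directory record of an empty zip).
size=$(wc -c < "$tmp/dummy.jar")
magic=$(head -c 2 "$tmp/dummy.jar")
if [ "$size" -lt 22 ] || [ "$magic" != "PK" ]; then
  echo "invalid jar: size=$size magic=$magic"
fi
```

Running the same check against the real localized `dummy.jar` path should confirm whether that entry is what breaks the REPL classpath in cluster mode (client mode never localizes `__spark_libs__`, which would explain why it works there).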
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:432)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:855)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:813)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:675)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:712)
at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
at scala.tools.nsc.interpreter.IMain.quietBind(IMain.scala:711)
at org.apache.zeppelin.spark.SparkScala211Interpreter$.loopPostInit$1(SparkScala211Interpreter.scala:164)
at org.apache.zeppelin.spark.SparkScala211Interpreter$.org$apache$zeppelin$spark$SparkScala211Interpreter$$loopPostInit(SparkScala211Interpreter.scala:199)
at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:95)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:98)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:577)
at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
INFO [2020-04-01 14:52:44,396] ({FIFOScheduler-interpreter_1074544391-Worker-1} SparkInterpreter.java[close]:159) - Close SparkInterpreter
DEBUG [2020-04-01 14:52:44,404] ({FIFOScheduler-interpreter_1074544391-Worker-1} AbstractScheduler.java[runJob]:143) - Job Error, paragraph_1571833609109_-338708939, %text org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:577)
at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:114)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
... 8 more
Caused by: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:432)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:855)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:813)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:675)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:712)
at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
at scala.tools.nsc.interpreter.IMain.quietBind(IMain.scala:711)
at org.apache.zeppelin.spark.SparkScala211Interpreter$.loopPostInit$1(SparkScala211Interpreter.scala:164)
at org.apache.zeppelin.spark.SparkScala211Interpreter$.org$apache$zeppelin$spark$SparkScala211Interpreter$$loopPostInit(SparkScala211Interpreter.scala:199)
at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:95)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:98)
... 9 more
INFO [2020-04-01 14:52:44,404] ({FIFOScheduler-interpreter_1074544391-Worker-1} AbstractScheduler.java[runJob]:152) - Job paragraph_1571833609109_-338708939 finished by scheduler interpreter_1074544391
DEBUG [2020-04-01 14:52:44,597] ({pool-5-thread-2} RemoteInterpreterServer.java[resourcePoolGetAll]:1014) - Request resourcePoolGetAll from ZeppelinServer
DEBUG [2020-04-01 14:53:11,566] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[close]:1226) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince: closed
DEBUG [2020-04-01 14:53:11,566] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[run]:1027) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince: stopped, remaining connections 0
ERROR [2020-04-01 14:54:21,726] ({main} Logging.scala[logError]:91) - Uncaught exception: java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:201)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:498)
at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:345)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply$mcV$sp(ApplicationMaster.scala:260)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$5.run(ApplicationMaster.scala:815)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:814)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:259)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:839)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
INFO [2020-04-01 14:54:21,727] ({main} Logging.scala[logInfo]:54) - Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds])
DEBUG [2020-04-01 14:54:21,727] ({main} Logging.scala[logDebug]:58) - shutting down user thread
INFO [2020-04-01 14:54:21,732] ({pool-1-thread-1} Logging.scala[logInfo]:54) - Deleting staging directory hdfs://dwhprodht/user/mvince/.sparkStaging/application_1573680659125_670094
DEBUG [2020-04-01 14:54:21,734] ({pool-1-thread-1} Client.java[<init>]:465) - The ping interval is 60000 ms.
DEBUG [2020-04-01 14:54:21,734] ({pool-1-thread-1} Client.java[setupIOstreams]:737) - Connecting to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020
DEBUG [2020-04-01 14:54:21,735] ({pool-1-thread-1} UserGroupInformation.java[logPrivilegedAction]:1896) - PrivilegedAction as:mvince (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:758)
DEBUG [2020-04-01 14:54:21,736] ({pool-1-thread-1} SaslRpcClient.java[sendSaslMessage]:457) - Sending sasl message state: NEGOTIATE
DEBUG [2020-04-01 14:54:21,737] ({pool-1-thread-1} SaslRpcClient.java[saslConnect]:389) - Received SASL message state: NEGOTIATE auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" challenge: "realm=\"default\",nonce=\"4kzuNdnplBFu+kdsalpfelwp[fewo,wdodwo\",qop=\"auth\",charset=utf-8,algorithm=md5-sess" } auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "dwh-prod-ht-master-01.mynetwork-dc.local" }
DEBUG [2020-04-01 14:54:21,737] ({pool-1-thread-1} SaslRpcClient.java[getServerToken]:264) - Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
DEBUG [2020-04-01 14:54:21,738] ({pool-1-thread-1} SaslRpcClient.java[createSaslClient]:247) - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
DEBUG [2020-04-01 14:54:21,738] ({pool-1-thread-1} SaslRpcClient.java[selectSaslClient]:176) - Use TOKEN authentication for protocol ClientNamenodeProtocolPB
DEBUG [2020-04-01 14:54:21,739] ({pool-1-thread-1} SaslRpcClient.java[handle]:683) - SASL client callback: setting username: jdisajisadjosaosdjaijdsjasdoijsaiojsadijdsaijsdai==
DEBUG [2020-04-01 14:54:21,739] ({pool-1-thread-1} SaslRpcClient.java[handle]:688) - SASL client callback: setting userPassword
DEBUG [2020-04-01 14:54:21,739] ({pool-1-thread-1} SaslRpcClient.java[handle]:693) - SASL client callback: setting realm: default
DEBUG [2020-04-01 14:54:21,740] ({pool-1-thread-1} SaslRpcClient.java[sendSaslMessage]:457) - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"jdisajisadjosaosdjaijdsjasdoijsaiojsadijdsaijsdai==\",realm=\"default\",nonce=\"4kzuNdnplBFu+kdsalpfelwp[fewo,wdodwo\",nc=00000001,cnonce=\"ufewijweij0932jd9dd+2sdads0asdi0\",digest-uri=\"/default\",maxbuf=65536,response=ad252d5ddf8d9be67a741cb9c2e27595,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
DEBUG [2020-04-01 14:54:21,741] ({pool-1-thread-1} SaslRpcClient.java[saslConnect]:389) - Received SASL message state: SUCCESS token: "rspauth=c72ffa1d7dd98d78dde18ffc147357cd"
DEBUG [2020-04-01 14:54:21,742] ({pool-1-thread-1} Client.java[setupIOstreams]:781) - Negotiated QOP is :auth
DEBUG [2020-04-01 14:54:21,742] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[run]:1009) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince: starting, having connections 1
DEBUG [2020-04-01 14:54:21,743] ({IPC Parameter Sending Thread #1} Client.java[run]:1072) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince sending #1
DEBUG [2020-04-01 14:54:21,745] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[receiveRpcResponse]:1129) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince got value #1
DEBUG [2020-04-01 14:54:21,745] ({pool-1-thread-1} ProtobufRpcEngine.java[invoke]:254) - Call: delete took 11ms
INFO [2020-04-01 14:54:21,747] ({pool-1-thread-1} Logging.scala[logInfo]:54) - Shutdown hook called
DEBUG [2020-04-01 14:54:21,749] ({pool-1-thread-1} ClientCache.java[stopClient]:97) - stopping client from cache: org.apache.hadoop.ipc.Client@35fe2125
DEBUG [2020-04-01 14:54:21,749] ({pool-1-thread-1} ClientCache.java[stopClient]:103) - removing client from cache: org.apache.hadoop.ipc.Client@35fe2125
DEBUG [2020-04-01 14:54:21,749] ({pool-1-thread-1} ClientCache.java[stopClient]:110) - stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@35fe2125
DEBUG [2020-04-01 14:54:21,749] ({pool-1-thread-1} Client.java[stop]:1279) - Stopping client
DEBUG [2020-04-01 14:54:21,751] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[close]:1226) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince: closed
DEBUG [2020-04-01 14:54:21,751] ({IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince} Client.java[run]:1027) - IPC Client (176955204) connection to dwh-prod-ht-master-01.mynetwork-dc.local/192.168.200.21:8020 from mvince: stopped, remaining connections 0
DEBUG [2020-04-01 14:54:21,852] ({Thread-5} ShutdownHookManager.java[run]:84) - ShutdownHookManger complete shutdown.