Nassir created ZEPPELIN-2740:
--------------------------------

             Summary: PySpark not working: "pyspark is not responding" error thrown after installing Zeppelin
                 Key: ZEPPELIN-2740
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-2740
             Project: Zeppelin
          Issue Type: Bug
            Reporter: Nassir


Hi,

I get the error below when running a simple one-line script in a cell:

%pyspark
x = 5

error: pyspark is not responding

Some log output from the command window is included below, in case it is useful:

DEBUG [2017-07-06 11:16:21,207] ({Thread-39} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:21,209] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:21,230] ({pool-4-thread-1} 
AppendOutputRunner.java[run]:91) - Processing time for append-output took 0 
milliseconds
DEBUG [2017-07-06 11:16:21,231] ({pool-4-thread-1} 
AppendOutputRunner.java[run]:107) - Processing size for append-output is 725 
characters
DEBUG [2017-07-06 11:16:21,590] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,590] ({pool-2-thread-4} Logging.scala[logInfo]:54) - Starting job: 
count at <console>:30
DEBUG [2017-07-06 11:16:21,603] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,603] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Got job 
0 (count at <console>:30) with 8 output partitions
DEBUG [2017-07-06 11:16:21,603] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,603] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Final 
stage: ResultStage 0 (count at <console>:30)
DEBUG [2017-07-06 11:16:21,604] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,604] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Parents 
of final stage: List()
DEBUG [2017-07-06 11:16:21,607] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,606] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Missing 
parents: List()
DEBUG [2017-07-06 11:16:21,610] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,610] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - 
Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at 
<console>:27), which has no missing parents
DEBUG [2017-07-06 11:16:21,708] ({Thread-39} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:21,712] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:21,735] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,735] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Block 
broadcast_0 stored as values in memory (estimated size 1216.0 B, free 408.9 MB)
DEBUG [2017-07-06 11:16:21,767] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,767] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Block 
broadcast_0_piece0 stored as bytes in memory (estimated size 879.0 B, free 
408.9 MB)
DEBUG [2017-07-06 11:16:21,770] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,770] ({dispatcher-event-loop-4} Logging.scala[logInfo]:54) - Added 
broadcast_0_piece0 in memory on 192.168.11.1:7299 (size: 879.0 B, free: 408.9 
MB)
DEBUG [2017-07-06 11:16:21,775] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,774] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Created 
broadcast 0 from broadcast at DAGScheduler.scala:996
DEBUG [2017-07-06 11:16:21,778] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,778] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - 
Submitting 8 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at 
parallelize at <console>:27)
DEBUG [2017-07-06 11:16:21,779] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,779] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Adding 
task set 0.0 with 8 tasks
DEBUG [2017-07-06 11:16:21,788] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,788] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - Added 
task set TaskSet_0.0 tasks to pool default
DEBUG [2017-07-06 11:16:21,837] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,837] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, 
PROCESS_LOCAL, 6094 bytes)
DEBUG [2017-07-06 11:16:21,841] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,841] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, 
PROCESS_LOCAL, 6098 bytes)
DEBUG [2017-07-06 11:16:21,844] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,844] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, 
PROCESS_LOCAL, 6094 bytes)
DEBUG [2017-07-06 11:16:21,847] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,846] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, 
PROCESS_LOCAL, 6098 bytes)
DEBUG [2017-07-06 11:16:21,849] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,848] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, 
PROCESS_LOCAL, 6094 bytes)
DEBUG [2017-07-06 11:16:21,851] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,850] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, 
PROCESS_LOCAL, 6098 bytes)
DEBUG [2017-07-06 11:16:21,853] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,852] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, 
PROCESS_LOCAL, 6094 bytes)
DEBUG [2017-07-06 11:16:21,855] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,855] ({dispatcher-event-loop-5} Logging.scala[logInfo]:54) - Starting 
task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, 
PROCESS_LOCAL, 6098 bytes)
DEBUG [2017-07-06 11:16:21,861] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-7} Logging.scala[logInfo]:54) - 
Running task 7.0 in stage 0.0 (TID 7)
DEBUG [2017-07-06 11:16:21,861] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-6} Logging.scala[logInfo]:54) - 
Running task 6.0 in stage 0.0 (TID 6)
DEBUG [2017-07-06 11:16:21,862] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-5} Logging.scala[logInfo]:54) - 
Running task 5.0 in stage 0.0 (TID 5)
DEBUG [2017-07-06 11:16:21,863] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-4} Logging.scala[logInfo]:54) - 
Running task 4.0 in stage 0.0 (TID 4)
DEBUG [2017-07-06 11:16:21,864] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-3} Logging.scala[logInfo]:54) - 
Running task 3.0 in stage 0.0 (TID 3)
DEBUG [2017-07-06 11:16:21,864] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-0} Logging.scala[logInfo]:54) - 
Running task 0.0 in stage 0.0 (TID 0)
DEBUG [2017-07-06 11:16:21,865] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-1} Logging.scala[logInfo]:54) - 
Running task 1.0 in stage 0.0 (TID 1)
DEBUG [2017-07-06 11:16:21,865] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,861] ({Executor task launch worker-2} Logging.scala[logInfo]:54) - 
Running task 2.0 in stage 0.0 (TID 2)
DEBUG [2017-07-06 11:16:21,868] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,867] ({Executor task launch worker-1} Logging.scala[logInfo]:54) - 
Fetching file:/C:/zeppelin-0.7.2-bin-all/interpreter/spark/pyspark/pyspark.zip 
with timestamp 1499336173165
DEBUG [2017-07-06 11:16:21,920] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,919] ({Executor task launch worker-1} Logging.scala[logInfo]:54) - 
C:\zeppelin-0.7.2-bin-all\interpreter\spark\pyspark\pyspark.zip has been 
previously copied to 
C:\Users\Nassir\AppData\Local\Temp\spark-68921710-cfa4-4b92-86fa-417befe702de\userFiles-449fd92c-b7b1-4bbe-9da9-2024aa1d074f\pyspark.zip
DEBUG [2017-07-06 11:16:21,961] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,961] ({Executor task launch worker-1} Logging.scala[logInfo]:54) - 
Fetching 
file:/C:/zeppelin-0.7.2-bin-all/interpreter/spark/pyspark/py4j-0.10.4-src.zip 
with timestamp 1499336173217
DEBUG [2017-07-06 11:16:21,968] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:21,967] ({Executor task launch worker-1} Logging.scala[logInfo]:54) - 
C:\zeppelin-0.7.2-bin-all\interpreter\spark\pyspark\py4j-0.10.4-src.zip has 
been previously copied to 
C:\Users\Nassir\AppData\Local\Temp\spark-68921710-cfa4-4b92-86fa-417befe702de\userFiles-449fd92c-b7b1-4bbe-9da9-2024aa1d074f\py4j-0.10.4-src.zip
DEBUG [2017-07-06 11:16:22,074] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-5} Logging.scala[logInfo]:54) - 
Finished task 5.0 in stage 0.0 (TID 5). 971 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,074] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-4} Logging.scala[logInfo]:54) - 
Finished task 4.0 in stage 0.0 (TID 4). 960 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,077] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-7} Logging.scala[logInfo]:54) - 
Finished task 7.0 in stage 0.0 (TID 7). 971 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,078] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-0} Logging.scala[logInfo]:54) - 
Finished task 0.0 in stage 0.0 (TID 0). 960 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,078] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-3} Logging.scala[logInfo]:54) - 
Finished task 3.0 in stage 0.0 (TID 3). 960 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,079] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-6} Logging.scala[logInfo]:54) - 
Finished task 6.0 in stage 0.0 (TID 6). 881 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,079] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-2} Logging.scala[logInfo]:54) - 
Finished task 2.0 in stage 0.0 (TID 2). 971 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,080] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,074] ({Executor task launch worker-1} Logging.scala[logInfo]:54) - 
Finished task 1.0 in stage 0.0 (TID 1). 1050 bytes result sent to driver
DEBUG [2017-07-06 11:16:22,083] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,082] ({task-result-getter-1} Logging.scala[logInfo]:54) - Finished 
task 4.0 in stage 0.0 (TID 4) in 234 ms on localhost (executor driver) (1/8)
DEBUG [2017-07-06 11:16:22,083] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,083] ({task-result-getter-3} Logging.scala[logInfo]:54) - Finished 
task 0.0 in stage 0.0 (TID 0) in 285 ms on localhost (executor driver) (2/8)
DEBUG [2017-07-06 11:16:22,084] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,083] ({task-result-getter-0} Logging.scala[logInfo]:54) - Finished 
task 5.0 in stage 0.0 (TID 5) in 234 ms on localhost (executor driver) (3/8)
DEBUG [2017-07-06 11:16:22,085] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,084] ({task-result-getter-3} Logging.scala[logInfo]:54) - Finished 
task 6.0 in stage 0.0 (TID 6) in 233 ms on localhost (executor driver) (4/8)
DEBUG [2017-07-06 11:16:22,091] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,084] ({task-result-getter-2} Logging.scala[logInfo]:54) - Finished 
task 7.0 in stage 0.0 (TID 7) in 231 ms on localhost (executor driver) (5/8)
DEBUG [2017-07-06 11:16:22,092] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,085] ({task-result-getter-0} Logging.scala[logInfo]:54) - Finished 
task 2.0 in stage 0.0 (TID 2) in 244 ms on localhost (executor driver) (6/8)
DEBUG [2017-07-06 11:16:22,092] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,085] ({task-result-getter-3} Logging.scala[logInfo]:54) - Finished 
task 1.0 in stage 0.0 (TID 1) in 246 ms on localhost (executor driver) (7/8)
DEBUG [2017-07-06 11:16:22,093] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,086] ({task-result-getter-1} Logging.scala[logInfo]:54) - Finished 
task 3.0 in stage 0.0 (TID 3) in 242 ms on localhost (executor driver) (8/8)
DEBUG [2017-07-06 11:16:22,094] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,089] ({task-result-getter-1} Logging.scala[logInfo]:54) - Removed 
TaskSet 0.0, whose tasks have all completed, from pool default
DEBUG [2017-07-06 11:16:22,095] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,091] ({dag-scheduler-event-loop} Logging.scala[logInfo]:54) - 
ResultStage 0 (count at <console>:30) finished in 0.301 s
DEBUG [2017-07-06 11:16:22,098] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,097] ({pool-2-thread-4} Logging.scala[logInfo]:54) - Job 0 finished: 
count at <console>:30, took 0.506634 s
DEBUG [2017-07-06 11:16:22,099] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:22,099] ({pool-2-thread-4} RemoteInterpreterServer.java[onAppend]:620) - 
Output Append:
DEBUG [2017-07-06 11:16:22,099] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -
DEBUG [2017-07-06 11:16:22,100] ({Thread-35} 
RemoteInterpreterEventPoller.java[run]:243) - Event from remote process 
OUTPUT_APPEND
 INFO [2017-07-06 11:16:22,102] ({pool-2-thread-3} 
NotebookServer.java[afterStatusChange]:2056) - Job 20170705-230127_1960162842 
is finished successfully, status: FINISHED
DEBUG [2017-07-06 11:16:22,100] ({pool-4-thread-1} 
AppendOutputRunner.java[run]:91) - Processing time for append-output took 0 
milliseconds
DEBUG [2017-07-06 11:16:22,100] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:22,099] ({pool-2-thread-4} RemoteInterpreterServer.java[onAppend]:620) - 
Output Append: res0: Long = 4
DEBUG [2017-07-06 11:16:22,106] ({pool-4-thread-1} 
AppendOutputRunner.java[run]:107) - Processing size for append-output is 1 
characters
DEBUG [2017-07-06 11:16:22,105] ({Thread-35} 
RemoteInterpreterEventPoller.java[run]:243) - Event from remote process 
OUTPUT_APPEND
DEBUG [2017-07-06 11:16:22,106] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -
DEBUG [2017-07-06 11:16:22,108] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:22,099] ({pool-1-thread-4} 
RemoteInterpreterEventClient.java[pollEvent]:229) - Send event OUTPUT_APPEND
DEBUG [2017-07-06 11:16:22,109] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:22,101] ({pool-2-thread-4} SchedulerFactory.java[jobFinished]:137) - Job 
remoteInterpretJob_1499336172196 finished by scheduler 
org.apache.zeppelin.spark.SparkInterpreter677854018
DEBUG [2017-07-06 11:16:22,109] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:22,102] ({pool-1-thread-3} Interpreter.java[getProperty]:165) - key: 
zeppelin.spark.concurrentSQL, value: false
DEBUG [2017-07-06 11:16:22,110] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:22,102] ({pool-1-thread-3} Interpreter.java[getProperty]:165) - key: 
zeppelin.spark.concurrentSQL, value: false
DEBUG [2017-07-06 11:16:22,110] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:22,105] ({pool-1-thread-4} 
RemoteInterpreterEventClient.java[pollEvent]:229) - Send event OUTPUT_APPEND
 INFO [2017-07-06 11:16:22,143] ({pool-2-thread-3} 
SchedulerFactory.java[jobFinished]:137) - Job paragraph_1499292087419_639890335 
finished by scheduler 
org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session1173488545
DEBUG [2017-07-06 11:16:22,149] ({qtp1566723494-20} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:22,149] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:22,149] ({pool-1-thread-3} 
RemoteInterpreterServer.java[resourcePoolGetAll]:919) - Request getAll from 
ZeppelinServer
DEBUG [2017-07-06 11:16:22,207] ({pool-4-thread-1} 
AppendOutputRunner.java[run]:91) - Processing time for append-output took 0 
milliseconds
DEBUG [2017-07-06 11:16:22,207] ({pool-4-thread-1} 
AppendOutputRunner.java[run]:107) - Processing size for append-output is 15 
characters
DEBUG [2017-07-06 11:16:22,215] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:22,717] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:23,219] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:23,721] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:24,224] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:24,726] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:25,230] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:25,733] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:26,236] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:26,740] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:27,242] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:27,744] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:28,246] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:28,749] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:29,251] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:29,753] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:30,255] ({Thread-41} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:30,257] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) -  INFO [2017-07-06 
11:16:30,257] ({pool-2-thread-6} SchedulerFactory.java[jobFinished]:137) - Job 
remoteInterpretJob_1499336180203 finished by scheduler interpreter_1164316755
 WARN [2017-07-06 11:16:30,260] ({pool-2-thread-2} NotebookServer.java[afterStatusChange]:2058) - Job 20170705-230150_1061129835 is finished, status: ERROR, exception: null, result: %text

Traceback (most recent call last):
  File "C:\Users\Nassir\AppData\Local\Temp\zeppelin_pyspark-8455639148956336705.py", line 22, in <module>
    from pyspark.conf import SparkConf
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "C:\zeppelin-0.7.2-bin-all\interpreter\spark\pyspark\pyspark.zip\pyspark\__init__.py", line 44, in <module>
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "C:\zeppelin-0.7.2-bin-all\interpreter\spark\pyspark\pyspark.zip\pyspark\context.py", line 40, in <module>
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "C:\zeppelin-0.7.2-bin-all\interpreter\spark\pyspark\pyspark.zip\pyspark\rdd.py", line 47, in <module>
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "C:\zeppelin-0.7.2-bin-all\interpreter\spark\pyspark\pyspark.zip\pyspark\statcounter.py", line 24, in <module>
  File "C:\Users\Nassir\Anaconda2\lib\site-packages\numpy\__init__.py", line 142, in <module>
    from . import add_newdocs
  File "C:\Users\Nassir\Anaconda2\lib\site-packages\numpy\add_newdocs.py", line 13, in <module>
    from numpy.lib import add_newdoc
  File "C:\Users\Nassir\Anaconda2\lib\site-packages\numpy\lib\__init__.py", line 8, in <module>
    from .type_check import *
  File "C:\Users\Nassir\Anaconda2\lib\site-packages\numpy\lib\type_check.py", line 11, in <module>
    import numpy.core.numeric as _nx
  File "C:\Users\Nassir\Anaconda2\lib\site-packages\numpy\core\__init__.py", line 72, in <module>
    from numpy.testing.nosetester import _numpy_tester
  File "C:\Users\Nassir\Anaconda2\lib\site-packages\numpy\testing\__init__.py", line 10, in <module>
    from unittest import TestCase
  File "C:\Users\Nassir\Anaconda2\lib\unittest\__init__.py", line 58, in <module>
    from .result import TestResult
  File "C:\Users\Nassir\Anaconda2\lib\unittest\result.py", line 7, in <module>
    from . import util
  File "C:\Users\Nassir\Anaconda2\lib\unittest\util.py", line 119, in <module>
    _Mismatch = namedtuple('Mismatch', 'actual expected value')
  File "C:\zeppelin-0.7.2-bin-all\interpreter\spark\pyspark\pyspark.zip\pyspark\serializers.py", line 393, in namedtuple
TypeError: namedtuple() missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'
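The final TypeError names 'verbose', 'rename', and 'module', which are exactly the parameters that Python 3.6 made keyword-only on collections.namedtuple (the <frozen importlib._bootstrap> frames also show this is Python 3.x, despite the Anaconda2 directory name). A quick check on the interpreter Zeppelin is launching, assuming Python 3.6 or later:

```python
import collections
import inspect

# On Python 3.6, collections.namedtuple grew keyword-only parameters
# (verbose/rename/module at the time; later versions dropped `verbose`).
sig = inspect.signature(collections.namedtuple)
kwonly = [p.name for p in sig.parameters.values()
          if p.kind is inspect.Parameter.KEYWORD_ONLY]
print(kwonly)  # 'rename' and 'module' appear here on 3.6+
```

If this list is non-empty, the Python being used is new enough to hit the signature change.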

%text pyspark is not responding
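For what it's worth, this failure pattern appears to match SPARK-19019: the bundled Spark's pyspark/serializers.py copies collections.namedtuple via types.FunctionType but does not carry over __kwdefaults__, so on Python 3.6 the keyword-only parameters lose their defaults and become required. A minimal sketch of that mechanism (make_point and the copy_func_* helpers are illustrative names, not Spark's actual code):

```python
import types

def make_point(x, y, *, scale=1, label="p"):
    # Stand-in for Python 3.6's namedtuple: keyword-only parameters
    # whose defaults live in make_point.__kwdefaults__.
    return (x * scale, y * scale, label)

def copy_func_broken(f):
    # Copies positional defaults (f.__defaults__) but NOT __kwdefaults__,
    # so the copy's keyword-only parameters have no defaults at all.
    return types.FunctionType(f.__code__, f.__globals__, f.__name__,
                              f.__defaults__, f.__closure__)

broken = copy_func_broken(make_point)
try:
    broken(1, 2)
except TypeError as e:
    print(e)  # "... missing 2 required keyword-only arguments: 'scale' and 'label'"

def copy_func_fixed(f):
    # Carrying the keyword-only defaults over restores normal behaviour.
    g = copy_func_broken(f)
    g.__kwdefaults__ = f.__kwdefaults__
    return g

assert copy_func_fixed(make_point)(1, 2) == (1, 2, "p")
```

If that is indeed the cause, running the interpreter against a Spark with the SPARK-19019 fix (2.1.1+), or pointing the zeppelin.pyspark.python interpreter property at a Python 3.5 or earlier, would presumably avoid it.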
DEBUG [2017-07-06 11:16:30,264] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:30,259] ({pool-1-thread-2} Interpreter.java[getProperty]:165) - key: 
zeppelin.spark.concurrentSQL, value: false
DEBUG [2017-07-06 11:16:30,267] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:30,259] ({pool-1-thread-2} Interpreter.java[getProperty]:165) - key: 
zeppelin.spark.concurrentSQL, value: false
 INFO [2017-07-06 11:16:30,328] ({pool-2-thread-2} 
SchedulerFactory.java[jobFinished]:137) - Job 
paragraph_1499292110304_-1979248670 finished by scheduler 
org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session1173488545
DEBUG [2017-07-06 11:16:30,337] ({qtp1566723494-15} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:30,337] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:30,337] ({pool-1-thread-2} 
RemoteInterpreterServer.java[resourcePoolGetAll]:919) - Request getAll from 
ZeppelinServer
DEBUG [2017-07-06 11:16:30,350] ({qtp1566723494-58} 
InterpreterSettingManager.java[getInterpreterSessionKey]:831) - Interpreter 
session key: shared_session, for note: 2CNACUTPT, user: anonymous, 
InterpreterSetting Name: spark
DEBUG [2017-07-06 11:16:30,351] ({Exec Stream Pumper} 
RemoteInterpreterManagedProcess.java[processLine]:206) - DEBUG [2017-07-06 
11:16:30,351] ({pool-1-thread-2} 
RemoteInterpreterServer.java[resourcePoolGetAll]:919) - Request getAll from 
ZeppelinServer




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
