Why not use the Spark web UI to compare the performance? It seems easier to
me.
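That said, if you do still need the stdout captured in a file as well, one lightweight workaround is to tee it inside the note itself. Here is a rough sketch for a Python/PySpark paragraph; the Tee class and the file name are just illustrative, not a Zeppelin API:

```python
import sys

class Tee:
    """Write-through wrapper that duplicates output to several streams."""
    def __init__(self, *streams):
        self.streams = streams

    def write(self, data):
        for s in self.streams:
            s.write(data)

    def flush(self):
        for s in self.streams:
            s.flush()

# "notebook_stdout.log" is just an illustrative file name.
log_file = open("notebook_stdout.log", "a")
sys.stdout = Tee(sys.__stdout__, log_file)

print("partial result: 42")  # goes to both the console and the file
sys.stdout.flush()
```

Everything printed afterwards lands both on the console and in the file, so you can post-analyze it without relaunching the job. Restore sys.stdout = sys.__stdout__ when you are done.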

Alessandro Liparoti <alessandro.l...@gmail.com> wrote on Tue, Jun 19, 2018 at 4:56 PM:

> I am comparing performance between different implementations of a Spark
> job, and I am testing a chunk of code which prints partial results and info
> to stdout. I could certainly replace all the prints with logger calls and
> collect them; I just wanted to know if there was a way to avoid that, or if
> this functionality was easier to implement than it seems.
>
> *Alessandro Liparoti*
>
> 2018-06-19 10:52 GMT+02:00 Jeff Zhang <zjf...@gmail.com>:
>
>>
>> I am not sure what kind of analysis you want to do. Is the logging info in
>> the interpreter log file enough for you? (You can raise the log level in
>> log4j.properties to get more logs.)
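>>
>> For example (a sketch; I am assuming the stock Zeppelin log4j 1.x
>> conf/log4j.properties here, and that org.apache.zeppelin.spark is the
>> package you care about), you could raise the Spark interpreter's level:
>>
>> ```properties
>> # Assumed logger name for the Zeppelin Spark interpreter classes;
>> # standard log4j 1.x "log4j.logger.<name>=<level>" syntax.
>> log4j.logger.org.apache.zeppelin.spark=DEBUG
>> ```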
>>
>> Alessandro Liparoti <alessandro.l...@gmail.com> wrote on Tue, Jun 19, 2018 at 4:47 PM:
>>
>>> I would like to post-analyze the output of verbose jobs in the notebook
>>> and save it, avoiding relaunching the jobs. It would also be good to
>>> have stderr logged to a file.
>>>
>>> Thanks
>>>
>>> *Alessandro Liparoti*
>>>
>>> 2018-06-19 10:43 GMT+02:00 Jeff Zhang <zjf...@gmail.com>:
>>>
>>>>
>>>> I am afraid it is not possible now. The stdout of notebooks is not
>>>> based on log4j. If you want it written to a file as well, you might need
>>>> to change the code of the interpreter itself.
>>>> Usually it is not necessary to log it to a log file as well; could you
>>>> tell us why you want that? Thanks
>>>>
>>>>
>>>>
>>>> alessandro.l...@gmail.com <alessandro.l...@gmail.com> wrote on Tue,
>>>> Jun 19, 2018 at 3:52 PM:
>>>>
>>>>> Good morning,
>>>>> I would like to have the stdout of notebooks both printed to the console
>>>>> and written to a file. How can I achieve that? I tried to play around
>>>>> with log4j but without any success; it seems to require a custom
>>>>> appender implementation.
>>>>> Any other, simpler idea?
>>>>>
>>>>
>>>
>
