+1. Text logs are much more human-readable.
On Wed, Dec 11, 2024 at 12:12 PM Igor Dvorzhak wrote:
> +1
+1
On Tue, Dec 10, 2024 at 7:48 PM Yang Jie wrote:
> +1
+1
On 2024/12/11 02:34:02 Kent Yao wrote:
> +1
+1
On 2024/11/23 02:50:36 Wenchen Fan wrote:
Hi Martin,
Yea, we should be more deliberate about when to use Structured Logging. Let
me start with when people prefer plain text logs:
- Spark engine developers like us. When running tests, the logs are printed
in the console, and a plain text log is more human-readable.
- Spark users who prefer to
+1 to defaulting to text logs!
Regards,
Mridul
On Fri, Nov 22, 2024 at 6:21 PM Gengliang Wang wrote:
Hi all,
Earlier this year, we introduced JSON logging as the default in Spark with
the aim of enhancing log structure and facilitating better analysis. While
this change was made with the best intentions, we've collectively observed
some practical challenges that impact usability.
*Key Observations*
+1 for defaulting to plain text logging. It is good for the simple usage
scenario and will also be more friendly to first-time Spark users.
And different companies may have already built some tooling to process Spark
logs. Using plain text by default will let those existing tools continue to
work.
On Friday
It doesn’t have to be very easy. It just has to be easier than maintaining two
infrastructures forever.
If we can’t easily parse the JSON log to emit the existing text content, I’d
say we have a bigger problem.
On Nov 22, 2024 at 2:17 PM -0800, Jungtaek Lim wrote:
I'm not sure it is very easy to provide a reader (I meant, viewer); it would
mostly be not a reader but a post-processor which converts the JSON-formatted
log to a plain text log. And after that, users would get the "same" UI/UX they
had when dealing with log files in Spark 3.x. For people who do not
really ne
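As a rough illustration of what such a post-processor could look like, here is
a minimal sketch in Python that reads JSON-lines log records and prints them as
plain text. The field names (ts, level, logger, msg, exception) are assumptions
made for the example, not the confirmed schema of Spark's structured logs:

# Hypothetical JSON-lines -> plain text log converter (sketch only; field
# names are assumed, the real structured-log schema may differ).
import json
import sys

def to_text(line: str) -> str:
    try:
        record = json.loads(line)
    except json.JSONDecodeError:
        return line.rstrip("\n")  # pass through non-JSON lines unchanged
    if not isinstance(record, dict):
        return line.rstrip("\n")
    header = " ".join(
        str(record[k]) for k in ("ts", "level", "logger", "msg") if k in record
    )
    # json.loads turns the escaped "\n" back into real newlines, so multi-line
    # stack traces and plan strings print the way they used to.
    exception = record.get("exception")
    return header + ("\n" + str(exception) if exception else "")

if __name__ == "__main__":
    for raw in sys.stdin:
        print(to_text(raw))

Usage would be along the lines of `python json_log_to_text.py < spark.log`
whenever someone wants the old plain text view back; the script name here is
of course hypothetical.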
Shouldn’t we differentiate between the logging and the reading of the log?
The problem appears to be in the presentation layer.
We could provide a basic log reader, instead of supporting two different ways
to log long-term.
On Nov 22, 2024, at 6:37 AM, Martin Grund wrote:
I'm generally supportive of this direction. However, I'm wondering if we
can be more deliberate about when to use it. For example, for the common
scenarios that you mention as "light" usage, we should switch to plain text
logging.
IMO, this would cover the cases where a user simply runs the pyspark
Hi all,
I'm writing this email to propose switching back to the previous plain text
logs by default, for the following reasons:
- The JSON log is not very human-readable. It's more verbose than plain
text, and new lines become `\n`, making query plan tree string and error
stacktrace very
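To make the readability point concrete, here is a small Python illustration
(the field names and stack frames are made up for the example, not the exact
Spark log schema) of how a multi-line error collapses into one long JSON line:

import json

# A multi-line message as it would appear in a plain text log.
plain = (
    "Job aborted due to stage failure\n"
    "    at org.apache.spark.scheduler.DAGScheduler.abortStage(...)\n"
    "    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(...)"
)
print(plain)  # three readable lines

# The same message as a single JSON record: the newlines are escaped to \n
# and the whole trace ends up on one long line.
print(json.dumps({"level": "ERROR", "msg": plain}))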