Hi all,
Recently, there was an issue about a leak in SparkR in
https://issues.apache.org/jira/browse/SPARK-21093.
It was made worse by the fact that R workers crashed easily on CentOS. This
was fixed in
https://github.com/apache/spark/commit/6b3d02285ee0debc73cbcab01b10398a498fbeb8.
It was about the very co
Impressive! I need to learn more about Scala.
What I mean by stripping away the conditional check in Java is this:
static final boolean isLogInfoEnabled = false;

public void logMessage(String message) {
    if (isLogInfoEnabled) {
        log.info(message);
    }
}
If you look at the byte code, the dead branch is gone: since
isLogInfoEnabled is a compile-time constant, javac strips the if block out
entirely.
I think it's more precise to say that args, like any expression, are
evaluated when their value is required. It's just that this special syntax
causes extra code to be generated that makes it effectively a function
being passed rather than a value, and one that's lazily evaluated. Look at
the bytecode if you're curious.
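A minimal sketch of the effect (names made up, not Spark code):

object ByNameDemo extends App {
  def logIfEnabled(enabled: Boolean, msg: => String): Unit = {
    if (enabled) println(msg)  // msg is evaluated here, and only if enabled
  }

  def expensive(): String = { println("building message"); "hello" }

  logIfEnabled(enabled = false, expensive()) // prints nothing: expensive() never runs
  logIfEnabled(enabled = true, expensive())  // prints "building message", then "hello"
}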
@Sean Got it! I come from the Java world, so I guess I was wrong in
assuming that arguments are evaluated at method invocation time. How about
the conditional checks to see if the log is InfoEnabled or DebugEnabled?
For Example,
if (log.isInfoEnabled) log.info(msg)
I hear we should use guard conditions.
Does adding -X to the mvn command give you more information?
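For example, from the Spark source root (the flags besides -X are just the
usual build options; adjust to whatever you were running):

./build/mvn -X -DskipTests clean package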
Cheers
On Sun, Jun 25, 2017 at 5:29 AM, 萝卜丝炒饭 <1427357...@qq.com> wrote:
> Hi all,
>
> Today I used a new PC to compile Spark.
> At the beginning, it worked well.
> But it stops at some point.
> The content in the console is:
> ==
Hi all,
Let me add more info about this.
The log showed:
17/06/25 17:31:26 DEBUG ReducedWindowedDStream: Time 1498383086000 ms is valid
17/06/25 17:31:26 DEBUG ReducedWindowedDStream: Window time = 2000 ms
17/06/25 17:31:26 DEBUG ReducedWindowedDStream: Slide time = 8000 ms
17/06/25 17:31:26 DEBU
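For context: ReducedWindowedDStream is the DStream you get when calling
reduceByKeyAndWindow with an inverse reduce function. A minimal sketch of
such a job (the durations mirror the Window/Slide times in the log above;
host, port, and everything else is made up):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WindowSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("sketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))
    ssc.checkpoint("/tmp/checkpoint") // the inverse-reduce variant requires checkpointing

    val counts = ssc.socketTextStream("localhost", 9999)
      .map(word => (word, 1))
      .reduceByKeyAndWindow(_ + _, _ - _, Seconds(2), Seconds(8))

    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}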
Hi,
I have a custom Spark Kryo encoder, but it is not in scope for the UDFs to
work:
https://stackoverflow.com/questions/44735235/spark-custom-kryo-encoder-not-providing-schema-for-udf
Regards,
Georg
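For anyone hitting the same thing: Encoders.kryo serializes the whole
object into a single binary column, so the resulting Dataset has no
per-field schema for a UDF to bind against. A minimal sketch of the
mechanics (Foo is a made-up type, not from the question):

import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

object KryoEncoderSketch {
  case class Foo(bar: Int) // hypothetical payload type

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("sketch").getOrCreate()

    // The encoder must be implicit and in scope where the Dataset is created:
    implicit val fooEncoder: Encoder[Foo] = Encoders.kryo[Foo]

    val ds = spark.createDataset(Seq(Foo(1), Foo(2)))
    ds.printSchema() // one "value: binary" column -- no fields visible to a UDF
  }
}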
Hi all,
Today I used a new PC to compile Spark.
At the beginning, it worked well.
But it stops at some point.
The content in the console is:
[INFO]
[INFO] --- maven-jar-plugin:2.6:test-jar (prepare-test-jar) @ spark-parent_2.11
---
[INFO]
[INFO] --- maven-site-plugin:3.3:att
On Thu, Jun 22, 2017 at 7:51 PM, OBones wrote:
> Hello,
>
> I'm trying to extend Spark so that it can use our own binary format as a
> read-only source for pipeline-based computations.
> I already have a Java class that gives me enough elements to build a
> complete StructType with enough metadata
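A minimal sketch of the Spark 2.x external data source hooks that handle
this (class names, fields, and the parallelize stand-in are hypothetical;
a real buildScan would read your binary format):

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.sources.{BaseRelation, RelationProvider, TableScan}
import org.apache.spark.sql.types._

// Entry point; callers would use spark.read.format("com.example.mybinary").load(...)
class DefaultSource extends RelationProvider {
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String]): BaseRelation =
    new MyBinaryRelation(sqlContext, parameters("path"))
}

class MyBinaryRelation(val sqlContext: SQLContext, path: String)
  extends BaseRelation with TableScan {

  // Build this from the metadata your Java class already provides.
  override def schema: StructType =
    StructType(Seq(StructField("id", LongType), StructField("value", DoubleType)))

  override def buildScan(): RDD[Row] =
    sqlContext.sparkContext.parallelize(Seq(Row(1L, 2.0)))
}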
Maybe you are looking for declarations like this. "=> String" means the arg
isn't evaluated until it's used, which is just what you want with log
statements. The message isn't constructed unless it will be logged.
protected def logInfo(msg: => String) {
  if (log.isInfoEnabled) log.info(msg)
}
On Sun, Jun 25, 2017 at 10:28 AM kant kodal
I am not getting the question. The logging trait does exactly what it says
on the box; I don't see what string concatenation has to do with it.
On Sun, Jun 25, 2017 at 11:27 AM, kant kodali wrote:
> Hi All,
>
> I came across this file https://github.com/apache/spark/blob/master/core/
> src/main/
Hi All,
I came across this file
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/Logging.scala
and I am wondering what its purpose is. In particular, it doesn't prevent
any string concatenation, and the if checks are already done by the library
itself.
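For what it's worth, the trait does avoid the concatenation: logInfo takes
its message by name, so the interpolation below never runs unless INFO is
enabled. A minimal sketch (Logging is private[spark], so this only compiles
inside Spark's own tree; the class and method are made up):

import org.apache.spark.internal.Logging

class MyComponent extends Logging {
  def run(): Unit = {
    // Built only if INFO is enabled, because msg is a by-name parameter:
    logInfo(s"state = ${expensiveDump()}")
  }
  private def expensiveDump(): String = "..." // stand-in for something costly
}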