Doug
thanks for responding.
>>I think Spark just needs to be compiled against 1.2.1
Can you elaborate on this, or point me to the specific command you are referring to?
In our build.scala, I was including the following:
"org.spark-project.hive" % "hive-exec" % "1.2.1.spark" intransitive()
I am not sur
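For reference, here is roughly how that line sits in a full Build.scala; only the
dependency line itself is from our build, the object and project names around it
are made up for illustration:

import sbt._
import Keys._

// Sketch of a Build.scala that pins Spark's forked Hive artifact; intransitive()
// keeps hive-exec's own dependencies from leaking onto the classpath.
object MyBuild extends Build {
  lazy val root = Project("root", file("."))
    .settings(
      libraryDependencies ++= Seq(
        "org.spark-project.hive" % "hive-exec" % "1.2.1.spark" intransitive()
      )
    )
}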
Yes. The crazy thing about Mesos running in fine-grained mode is that there
is no way (correct me if I'm wrong) to set the number of cores per
executor. If one of my slaves on Mesos has 32 cores, fine-grained mode
can allocate all 32 cores on that executor for the job, and if there are 32
tasks runn
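For what it's worth, these are the only knobs I'm aware of, and they are either
app-wide caps or require coarse-grained mode; the master URL and numbers below
are placeholders:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: coarse-grained Mesos mode with an app-wide core cap; as far as I can
// tell there is no equivalent per-executor core limit in fine-grained mode.
val conf = new SparkConf()
  .setAppName("mesos-cores-sketch")
  .setMaster("mesos://zk://zk-host:2181/mesos") // placeholder master URL
  .set("spark.mesos.coarse", "true")            // coarse-grained scheduling
  .set("spark.cores.max", "8")                  // cap on total cores for this app
  .set("spark.task.cpus", "1")                  // CPUs requested per task
val sc = new SparkContext(conf)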
All,
just to see if this happens to others as well.
This is tested against Spark 1.5.1 (branch 1.5, labeled 1.5.2-SNAPSHOT, at
commit 84f510c4fa06e43bd35e2dc8e1008d0590cbe266 from Tue Oct 6).
Spark deployment mode: Spark-Cluster
Notice that if we enable Kerberos mode, the
Is this still Mesos fine-grained mode?
On Wed, Oct 21, 2015 at 1:16 PM, Jerry Lam wrote:
> Hi guys,
>
> There is another memory issue. Not sure if this is related to Tungsten
> this time because I have it disabled (spark.sql.tungsten.enabled=false). It
> happens when there are too many tasks run
Hey Luciano,
This sounds like a reasonable plan to me. One of my colleagues has written
some Dockerized MySQL testing utilities, so I'll take a peek at those to
see if there are any specifics of their solution that we should adapt for
Spark.
On Wed, Oct 21, 2015 at 1:16 PM, Luciano Resende wrote:
Hi guys,
There is another memory issue. Not sure if this is related to Tungsten this
time because I have it disabled (spark.sql.tungsten.enabled=false). It
happens when there are too many tasks running (300). I need to limit the
number of tasks to avoid this. The executor has 6G. Spark 1.5.1 is been
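To be concrete, this is roughly how I am limiting the number of tasks at the
moment; the app name, table name, and partition counts are placeholders, and
I am assuming the pressure comes from shuffle partitions:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Sketch: cap concurrent work by reducing shuffle partitions and coalescing,
// with Tungsten disabled as described above.
val sc = new SparkContext(new SparkConf().setAppName("task-limit-sketch"))
val sqlContext = new SQLContext(sc)
sqlContext.setConf("spark.sql.tungsten.enabled", "false")
sqlContext.setConf("spark.sql.shuffle.partitions", "64") // down from the default 200
val df = sqlContext.table("some_table")                  // placeholder table
val fewer = df.coalesce(64)                              // fewer, larger tasks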
I have started looking into PR-8101 [1] and what is required to merge it
into trunk, which will also unblock me on SPARK-10521 [2].
So here is the minimal plan I was thinking about:
- pin the Docker image version so we are sure we always use the same image
(see the sketch after this list)
- pull the requi
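As a sketch of the first point, the idea is simply a pinned tag instead of
latest in the Docker-based JDBC test; the trait name and tag below are made up:

// Sketch only: the test pulls a fixed image tag so every run is reproducible.
trait MySQLDockerTestFixture {
  val dockerImage: String = "mysql:5.7.9" // hypothetical pinned tag, not "latest"
}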
@Reynold submitted the PR: https://github.com/apache/spark/pull/9199
On Wed, Oct 21, 2015 at 11:01 PM, Shagun Sodhani
wrote:
> Sure! Would do that.
>
> Thanks a lot
>
> On Wed, Oct 21, 2015 at 10:59 PM, Reynold Xin wrote:
>
>> I think we made a mistake and forgot to register the function in the
Sure! Would do that.
Thanks a lot
On Wed, Oct 21, 2015 at 10:59 PM, Reynold Xin wrote:
> I think we made a mistake and forgot to register the function in the
> registry:
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegi
I think we made a mistake and forgot to register the function in the
registry:
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala
Do you mind submitting a pull request to fix this? Should be a one-line
change. I fi
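For reference, my guess at what that one-line change looks like, modeled on how
the neighbouring math functions are registered in FunctionRegistry.scala; the
surrounding lines are from memory, not a verbatim diff:

// Inside the expressions map of FunctionRegistry.scala -- sketch, not verbatim.
expression[Sinh]("sinh"),
expression[Cosh]("cosh"), // the entry that appears to be missing
expression[Tanh]("tanh"),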
Apologies for reposting this to the dev list, but I've had no luck getting
information about spark.driver.cores on the user list.
Happy to create a PR with documentation improvements for the spark.driver.cores
config setting after I get some more details.
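My current understanding, which is exactly what I'd like to confirm before
touching the docs: spark.driver.cores only takes effect in cluster deploy mode,
where the cluster manager sizes the driver process, and it is normally supplied
via spark-defaults.conf or spark-submit rather than in application code. A
hypothetical snippet of the keys and example values I would document:

import org.apache.spark.SparkConf

// Sketch: sizing the driver for cluster deploy mode (values are placeholders).
// These keys would typically live in spark-defaults.conf or be passed with
// spark-submit --conf; shown as a SparkConf here purely to document the keys.
val conf = new SparkConf()
  .set("spark.driver.cores", "2")
  .set("spark.driver.memory", "4g")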
Thanks!
-adrian
From: Adrian Tanase
You're welcome to open a little pull request to fix that.
On Wed, Oct 21, 2015, 10:47 AM tyronecai wrote:
> In conf/spark-env.sh.template
> https://github.com/apache/spark/blob/master/conf/spark-env.sh.template#L42
> # - SPARK_DRIVER_MEMORY, Memory for Master (e.g. 1000M, 2G) (Default: 1G)
>
>
>
In conf/spark-env.sh.template
https://github.com/apache/spark/blob/master/conf/spark-env.sh.template#L42
# - SPARK_DRIVER_MEMORY, Memory for Master (e.g. 1000M, 2G) (Default: 1G)
SPARK_DRIVER_MEMORY is the memory setting for the driver, not the master.
Thanks!
--
Hi! I was trying out different arithmetic functions in Spark SQL. I noticed
a weird thing: while the *sinh* and *tanh* functions work, using *cosh* results
in an error saying:
*Exception in thread "main" org.apache.spark.sql.AnalysisException:
undefined function cosh;*
The documentation says *c
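In case it helps, a minimal repro of what I am seeing; the app name is a
placeholder and I am running local mode against Spark 1.5.1:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Minimal repro sketch: sinh and tanh resolve, cosh does not.
val sc = new SparkContext(new SparkConf().setAppName("cosh-repro").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)
sqlContext.range(1).selectExpr("sinh(id)", "tanh(id)").show() // works
sqlContext.range(1).selectExpr("cosh(id)").show()             // AnalysisException: undefined function cosh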