The third issue may be related to this:
https://issues.apache.org/jira/browse/SPARK-2022
We can take a look at this during the bug fix period for the 1.1
release next week. If we come up with a fix we can backport it into
the 1.0 branch also.
On Wed, Jul 30, 2014 at 11:31 PM, Patrick Wendell wrote:
Thanks for digging around here. I think there are a few distinct issues.
1. Properties containing the '=' character need to be escaped.
I was able to load properties fine as long as I escape the '='
character. But maybe we should document this:
== spark-defaults.conf ==
spark.foo a\=B
== shell ==
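To illustrate the escaping behavior described above: assuming the defaults file is read with java.util.Properties semantics (which is my understanding of how SparkSubmit loads it, not something confirmed in this thread), a minimal sketch of how the escaped '=' is parsed:

```java
import java.io.StringReader;
import java.util.Properties;

public class EscapeDemo {
    public static void main(String[] args) throws Exception {
        // In a properties file, an unescaped '=' after the key would be
        // treated as a key-value separator; the backslash makes it part
        // of the value instead.
        Properties p = new Properties();
        p.load(new StringReader("spark.foo a\\=B\n"));
        System.out.println(p.getProperty("spark.foo")); // prints "a=B"
    }
}
```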
Hi Folks,
Today I am trying to build spark using maven; however, the following
command failed consistently for both 1.0.1 and the latest master. (BTW, it
seems sbt works fine: sbt/sbt -Dhadoop.version=2.4.0 -Pyarn clean
assembly)
Environment: Mac OS Mavericks
Maven: 3.2.2 (installed by homebrew)
See Mailing list section of:
https://spark.apache.org/community.html
On Wed, Jul 30, 2014 at 6:53 PM, Grace wrote:
I found SPARK-2706
Let me attach tentative patch there - I still face compilation error.
Cheers
On Mon, Jul 28, 2014 at 5:59 PM, Ted Yu wrote:
> bq. Either way it's unclear if there is any reason to use reflection to
> support multiple versions, instead of just upgrading to Hive 0.13.0
I think I might have found the root cause: YARN-1931 addressed the
incompatibility. The solution in my case might be either to take the
related Spark patches or to do an upgrade.
On Wed, Jul 30, 2014 at 2:11 PM, yao wrote:
> Hi Everyone,
>
> We got some yarn related errors when running spark 0.9.0 on hadoop 2.4
In addition, spark.executor.extraJavaOptions does not seem to behave as I
would expect; java arguments don't seem to be propagated to executors.
$ cat conf/spark-defaults.conf
spark.master mesos://zk://etl-01.mxstg:2181,etl-02.mxstg:2181,etl-03.mxstg:2181/masters
spark.executor.extraJavaOptions
Either whitespace or an equals sign is a valid key-value separator in a
properties file.
Here's an example:
$ cat conf/spark-defaults.conf
spark.driver.extraJavaOptions -Dfoo.bar.baz=23
$ ./bin/spark-shell -v
Using properties file: /opt/spark/conf/spark-defaults.conf
Adding default property: spark.driver.extraJavaOptions=-Dfoo.bar.baz=23
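To make the point above concrete: assuming java.util.Properties parsing semantics for the defaults file (my assumption, not stated in this thread), both separator styles produce the same key-value pair, which a short Java sketch can show:

```java
import java.io.StringReader;
import java.util.Properties;

public class SeparatorDemo {
    public static void main(String[] args) throws Exception {
        // Whitespace as the separator between key and value.
        Properties ws = new Properties();
        ws.load(new StringReader("spark.driver.extraJavaOptions -Dfoo.bar.baz=23\n"));

        // '=' as the separator; the later '=' inside the value is kept.
        Properties eq = new Properties();
        eq.load(new StringReader("spark.driver.extraJavaOptions=-Dfoo.bar.baz=23\n"));

        // Both parse to the same value.
        System.out.println(ws.getProperty("spark.driver.extraJavaOptions")); // -Dfoo.bar.baz=23
        System.out.println(eq.getProperty("spark.driver.extraJavaOptions")); // -Dfoo.bar.baz=23
    }
}
```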
Hi Everyone,
We got some yarn related errors when running spark 0.9.0 on hadoop 2.4 (but
it was okay on hadoop 2.2). I didn't find any comments saying spark 0.9.0
supports hadoop 2.4, so could I assume that we have to upgrade Spark to
the latest release version at this point to solve this issue?
Cody - in your example you are using the '=' character, but in our
documentation and tests we use whitespace to separate the key and
value in the defaults file.
docs: http://spark.apache.org/docs/latest/configuration.html
spark.driver.extraJavaOptions -Dfoo.bar.baz=23
I'm not sure if the java
Hi Cody,
Could you file a bug for this if there isn't one already?
For system properties SparkSubmit should be able to read those
settings and do the right thing, but that obviously won't work for
other JVM options... the current code should work fine in cluster mode
though, since the driver is a
We were previously using SPARK_JAVA_OPTS to set java system properties via
-D.
This was used for properties that varied on a per-deployment-environment
basis, but needed to be available in the spark shell and workers.
On upgrading to 1.0, we saw that SPARK_JAVA_OPTS had been deprecated and
replaced.
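For reference, the migration described above can be sketched as follows. This is a hedged sketch: the -Dfoo.bar.baz=23 property is the illustrative example used earlier in this thread, and the replacement property names are the spark.*.extraJavaOptions settings discussed above.

```
== old: spark-env.sh ==
SPARK_JAVA_OPTS="-Dfoo.bar.baz=23"

== new: spark-defaults.conf ==
spark.driver.extraJavaOptions   -Dfoo.bar.baz=23
spark.executor.extraJavaOptions -Dfoo.bar.baz=23
```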