+1
Tested on Mac OS X
Burak
On Thu, Jun 4, 2015 at 6:35 PM, Calvin Jia wrote:
> +1
>
> Tested with input from Tachyon and persist off heap.
>
> On Thu, Jun 4, 2015 at 6:26 PM, Timothy Chen wrote:
>
>> +1
>>
>> Been testing cluster mode and client mode with Mesos on a 6-node cluster.
>>
>> Ev
Hi everyone,
Considering that the Python API is just a front end that needs SPARK_HOME defined
anyway, I think it would be interesting to deploy the Python part of Spark
on PyPI so that a Python project needing PySpark could handle the
dependency via pip.
For now I just symlink the python/pyspark in my
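A minimal sketch of that kind of symlink setup (the paths below are purely illustrative):

  export SPARK_HOME=/opt/spark-1.4.0
  ln -s $SPARK_HOME/python/pyspark \
        /path/to/project/venv/lib/python2.7/site-packages/pyspark
  # py4j (bundled under $SPARK_HOME/python/lib) must also be importable for pyspark to work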
Hi,
I added
in my pom.xml and the problem is solved.
+
false
Thank you @Steve and @Ted
Regards,
Meethu Mathew
Senior Engineer
Flytxt
On Thu, Jun 4, 2015 at 9:51 PM, Ted Yu wrote:
> Andrew Or put in this workaround:
>
> diff --git a/pom.xml b/pom.xml
> index 0b1aaad..
+1
Tested with input from Tachyon and persist off heap.
On Thu, Jun 4, 2015 at 6:26 PM, Timothy Chen wrote:
> +1
>
> Been testing cluster mode and client mode with Mesos on a 6-node cluster.
>
> Everything works so far.
>
> Tim
>
> On Jun 4, 2015, at 5:47 PM, Andrew Or wrote:
>
> +1 (binding)
+1
Been testing cluster mode and client mode with Mesos on a 6-node cluster.
Everything works so far.
Tim
> On Jun 4, 2015, at 5:47 PM, Andrew Or wrote:
>
> +1 (binding)
>
> Ran the same tests I did for RC3:
>
> Tested the standalone cluster mode REST submission gateway - submit / status
+1 (binding)
Ran the same tests I did for RC3:
Tested the standalone cluster mode REST submission gateway - submit /
status / kill
Tested simple applications on YARN client / cluster modes with and without
--jars
Tested Python applications on YARN client / cluster modes with and without
--py-files
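For reference, a sketch of the kind of submissions these tests exercise (class, app, and file names below are made up):

  # standalone cluster mode through the REST submission gateway (port 6066)
  ./bin/spark-submit --master spark://master:6066 --deploy-mode cluster \
    --class com.example.Main example.jar
  # Python app in YARN cluster mode with extra .py dependencies
  ./bin/spark-submit --master yarn-cluster --py-files deps.zip example.py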
I saw something like this last night, with a similar message. Is this what
you’re referring to?
[error]
org.deeplearning4j#dl4j-spark-ml;0.0.3.3.4.alpha1-SNAPSHOT!dl4j-spark-ml.jar
origin location must be absolute:
file:/Users/eron/.m2/repository/org/deeplearning4j/dl4j-spark-ml/0.0.3.3.4.alp
+1
Tested on Mac OS X
> On Jun 4, 2015, at 1:09 PM, Patrick Wendell wrote:
>
> I will give +1 as well.
>
> On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin wrote:
>> Let me give you the 1st
>>
>> +1
>>
>>
>>
>> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell wrote:
>>>
>>> Hey all - a tiny
Here's one of the types of exceptions I get (this one when running
VersionsSuite from sql/hive):
[info] - 13: create client *** FAILED *** (1 second, 946 milliseconds)
[info] java.lang.RuntimeException: [download failed:
org.apache.httpcomponents#httpclient;4.2.5!httpclient.jar, download failed:
I will give +1 as well.
On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin wrote:
> Let me give you the 1st
>
> +1
>
>
>
> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell wrote:
>>
>> Hey all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
>> exact commit and all other information is cor
interesting... i definitely haven't seen it happen that often in our build
system, and when it has happened, i wasn't able to determine the cause.
On Thu, Jun 4, 2015 at 10:16 AM, Marcelo Vanzin wrote:
> On Thu, Jun 4, 2015 at 10:04 AM, shane knapp wrote:
>
>> this has occasionally happened on
On Thu, Jun 4, 2015 at 10:04 AM, shane knapp wrote:
> this has occasionally happened on our jenkins as well (twice since last
> august), and deleting the cache fixes it right up.
>
Yes, deleting the cache fixes things, but it's kinda annoying to have to do
that. And yesterday when I was testing a
this has occasionally happened on our jenkins as well (twice since last
august), and deleting the cache fixes it right up.
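for reference, "deleting the cache" here just means removing the local ivy cache, assuming the default location:

  rm -rf ~/.ivy2/cache    # or move the whole directory aside first: mv ~/.ivy2 ~/.ivy2.bak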
On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen wrote:
> I've definitely seen the "dependency path must be relative" problem,
> and fixed it by deleting the ivy cache, but I don't
They're my local builds, so I wouldn't be able to send you any links... and
the error is generally from sbt, not the unit tests. But if there's any
info I can collect when I see the error, let me know.
I'll try "spark.jars.ivy". I wonder if we should just set that to the
system properties in Spark
Hi Marcelo,
This is interesting. Can you please send me links to any failing builds if
you see that problem? For now you can set the conf `spark.jars.ivy`
to make Spark use a path other than `~/.ivy2`.
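For example (the directory below is just an illustration):

  ./bin/spark-submit --conf spark.jars.ivy=/tmp/spark-ivy ...

or the equivalent line in conf/spark-defaults.conf:

  spark.jars.ivy   /tmp/spark-ivy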
Thanks,
Burak
On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen wrote:
> I've definitely seen the "
Trying the spark-dev mailing list to see if anyone knows.
-- Forwarded message --
From: Ashwin Shankar
Date: Wed, Jun 3, 2015 at 5:38 PM
Subject: How to pass system properties in Spark?
To: "u...@spark.apache.org"
Hi,
I'm trying to use property substitution in my log4j.properties,
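One common way to pass such -D system properties through spark-submit (the property name here is just an illustration):

  ./bin/spark-submit \
    --driver-java-options "-Dlogfile.name=myapp.log" \
    --conf "spark.executor.extraJavaOptions=-Dlogfile.name=myapp.log" \
    ...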
Andrew Or put in this workaround:
diff --git a/pom.xml b/pom.xml
index 0b1aaad..d03d33b 100644
--- a/pom.xml
+++ b/pom.xml
@@ -1438,6 +1438,8 @@
2.3
false
+
+ false
FYI
On Thu, Jun 4, 2015 at 6:25 AM, Ste
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
... which contains ...
https://issues.apache.org/jira/browse/SPARK-7993?jql=project%20%3D%20SPARK%20AND%20labels%20%3D%20Starter%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)
On Thu, Jun 4, 2015 at 5:45
I am new to Spark and would like to contribute to it. I recall seeing
somewhere on the website a link to a JIRA filter for new contributors,
but can't find that anymore. Could someone point me to it?
Thanks,
-ravi
Hey everyone,
I’m looking to develop a package for use with SparkR. This package would
include custom R and Scala code and I was wondering if anyone had any insight
into how I might be able to use the sbt-spark-package tool to publish something
that needs to include an R package as well as a JA
On 4 Jun 2015, at 11:16, Meethu Mathew <meethu.mat...@flytxt.com> wrote:
Hi all,
I added some new code to MLlib. When I am trying to build only the mllib
project using mvn --projects mllib/ -DskipTests clean install
after setting export SPARK_PREPEND_CLASSES=true, the build is ge
I've definitely seen the "dependency path must be relative" problem,
and fixed it by deleting the ivy cache, but I don't know more than
this.
On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin wrote:
> Hey all,
>
> I've been bit by something really weird lately and I'm starting to think
> it's relate
Hi all,
I added some new code to MLlib. When I am trying to build only the mllib
project using mvn --projects mllib/ -DskipTests clean install
after setting export SPARK_PREPEND_CLASSES=true, the build is getting stuck
with the following message.
> Excluding org.jpmml:pmml-schema:ja
Hi DB Tsai,
Not for now. My primary reference is
http://jmlr.csail.mit.edu/proceedings/papers/v15/wang11a/wang11a.pdf .
And I'm seeking a way to maximize code reuse. Any suggestion is welcome.
Thanks.
Regards,
yuhao
-Original Message-
From: DB Tsai [mailto:dbt...@dbtsai.com]
Sent
Let me give you the 1st
+1
On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell wrote:
> Hey all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
> exact commit and all other information is correct. (thanks Shivaram
> who pointed this out).
>
> On Tue, Jun 2, 2015 at 8:53 PM, Patrick