+1 as well.
Regards
JB
On Aug 16, 2017, at 10:12, "Ismaël Mejía" wrote:
>+1 (non-binding)
>
>This is something really great to have. More schedulers and runtime
>environments are a HUGE win for the Spark ecosystem.
>Amazing work, big kudos for the guys who created and continue working on…
>>>>>>>>>>> Q: How can I help test this release?
>>>>>>>>>>> A: If you are a Spark user, you can help us test this release by taking an existing Spark workload and running on this release candidate, then reporting any regressions from 2.0.0.
>>>>>>>>>>>
>>>>>>>>>>> Q: What justifies a -1 vote for this release?
>>>>>>>>>>> A: This is a maintenance release in the 2.0.x series. Bugs already present in 2.0.0, missing features, or bugs related to new features will not necessarily block this release.
>>>>>>>>>>>
>>>>>>>>>>> Q: What fix version should I use for patches merging into branch-2.0 from now on?
>>>>>>>>>>> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.
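(Illustration only, not from the thread: testing a release candidate along the lines quoted above usually amounts to re-running an existing job against the staged binaries. The archive name, class name, and jar path below are placeholders.)

tar xzf spark-2.0.1-bin-hadoop2.7.tgz
cd spark-2.0.1-bin-hadoop2.7
./bin/spark-submit --master local[4] --class com.example.ExistingJob /path/to/existing-job.jar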
Agreed.
Regards
JB
On Aug 18, 2016, at 07:32, Olivier Girardot wrote:
>CC'ing the dev list. You should open a JIRA and a PR related to it to discuss it; cf.
>https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-ContributingCodeChanges
Thanks,
--Prashant
Hi Luciano,
I didn't mean Spark proper, but more something like you proposed.
Regards
JB
On 03/26/2016 06:38 PM, Luciano Resende wrote:
On Sat, Mar 26, 2016 at 10:20 AM, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
Hi Luciano,
If we take the "pure"…
> …uld stay on Scala 2.10. And finally, with Scala 2.12 around the corner, you really don't want to be supporting 3 versions. So clearly I am missing something here.
>
> On Thu, Mar 24, 2016 at 8:52 AM, Jean-Baptiste Onofré <j…
is not that big of a deal right now, but will
become increasingly more difficult as we optimize performance.
The downside of not supporting Java 7 is also obvious. Some
organizations are stuck with Java 7, and they wouldn't be able to use
Spark 2.0 without upgrading Java.
OK, so Kafka, Kinesis and Flume will stay in Spark.
Thanks,
Regards
JB
On 03/22/2016 08:30 AM, Reynold Xin wrote:
Kinesis is still in it. I think it's OK to add Flume back.
On Tue, Mar 22, 2016 at 12:29 AM, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
Thanks for…
…like to see the streaming-flume added back to Apache Spark.
Thanks,
Kostas
…from your reply that it's not possible for a single project to have multiple repos?
…management platform (Bedrock) and self-service data preparation solution (Mica) leverage Spark for fast execution of transformations and data exploration.
…park.zip anymore?
Is there an easy way of keeping these things within the ASF Spark
project? I think that would be better for everybody.
Heads up: I just updated my local copy, it looks better now (the build is so far so good). I'll keep you posted.
Regards
JB
On 01/11/2016 01:59 PM, Jean-Baptiste Onofré wrote:
I confirm: I have the same issue.
I tried Josh's PR, but the branch is not found:
git pull https://github.com/JoshRosen/spark netty-hotfix
As the issue is on Flume external, I'm not sure it's related.
Let me take a look and eventually provide a fix.
Regards
JB
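(Side note, not from the original thread: when the branch on a contributor's fork has been deleted, the same commits can usually still be fetched from the main repo via the pull request ref. A sketch, with NNNN standing in for the PR number, which is not given here:)

git fetch https://github.com/apache/spark pull/NNNN/head:netty-hotfix
git checkout netty-hotfix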
…2.7. Some libraries that Spark depends on have stopped supporting 2.6. We can still convince the library maintainers to support 2.6, but it will be extra work. I'm curious if anybody still uses Python 2.6 to run Spark.
Thanks.
…(SPARK-11724 <https://issues.apache.org/jira/browse/SPARK-11724>).
* With the improved query planner for queries having distinct aggregations (SPARK-9241 <https://issues.apache.org/jira/browse/SPARK-9241>), the plan of a query having a single distinct aggregation has been changed…
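(For illustration only, with made-up table and column names: a "single distinct aggregation" is simply a query of this shape.)

./bin/spark-sql -e "SELECT count(DISTINCT user_id) FROM events"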
+1 (non-binding)
Tested with different samples.
Regards
JB
Sent from my Samsung device
Original message
From: Michael Armbrust
Date: 12/12/2015 18:39 (GMT+01:00)
To: dev@spark.apache.org
Subject: [VOTE] Release Apache Spark 1.6.0 (RC2)
Please vote on releasing the
…keep only Hadoop 2.6 and greater.
What are the community's thoughts on that?
…takes place when you compute the DStream, right?
…swapping is harder.
The proposal is to just avoid a single fat jar.
>>>> classes isolated to a few public packages, and these public packages should have minimal private classes for low level developer APIs.
>>>>
>>>> 5. Consolidate task metric and accumulator API. Although having so…
…conversations recently about Spark 2.0, and I know I and others have a number of ideas about it already.
I'll go ahead and make 1.7.0, but thought I'd ask: how much other interest is there in starting to plan Spark 2.0? Is that even on the table as the next release after 1.6?
Sean
…reporting any regressions.
What justifies a -1 vote for this release?
A -1 vote should occur for regressions from Spark 1.5.1. Bugs already present in 1.5.1 will not block this release.
Hi Ted,
thanks for the update. The build with sbt is in progress on my box.
Regards
JB
On 11/03/2015 03:31 PM, Ted Yu wrote:
Interesting, Sbt builds were not all failing:
https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/
FYI
On Tue, Nov 3, 2015 at 5:58 AM, Jean-Baptiste Onofré
On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré wrote:
Thanks for the update. I used mvn to build, but without the hive profile.
Let me try mvn with the same options as you, and sbt as well.
I'll keep you posted.
Regards
JB
On 11/03/2015 12:55 PM, Jeff Zhang wrote:
I found it is due to SPARK
clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Psparkr
On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
Hi Jeff,
it works for me (with skipping the tests).
Let me try again, just to be sure.
Regards
JB
On 11/03
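(For anyone reproducing: the flags quoted above correspond to a full invocation roughly like the following, assuming the Maven wrapper bundled with the source tree; plain mvn with the same arguments is equivalent.)

./build/mvn clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Psparkr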
[error] /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384: not found: value HashCodes
[error]     val cookie = HashCodes.fromBytes(secret).toString()
[error]                  ^
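(A guess, not confirmed in this thread: HashCodes is presumably Guava's com.google.common.hash.HashCodes, so "not found: value HashCodes" usually means the import, or the Guava version providing that class, is missing at that point. A minimal sketch of the same call, assuming a Guava release of that era where the class still exists:)

// Assumes Guava's HashCodes is on the classpath (later Guava releases moved these factories to HashCode)
import com.google.common.hash.HashCodes

val secret: Array[Byte] = Array[Byte](1, 2, 3)  // stand-in for the real secret bytes
val cookie: String = HashCodes.fromBytes(secret).toString()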
…spark://masterip:7077, it is not working…
Any type of help would be appreciated. Thanks in advance.
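(Not from the original mail, just the usual shape of the commands for connecting to a standalone master; the host name, class name, and jar path are placeholders.)

./bin/spark-shell --master spark://master-host:7077
./bin/spark-submit --master spark://master-host:7077 --class com.example.MyApp /path/to/my-app.jar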
…either an IP address or a hostname should work
<https://github.com/apache/spark/blob/4ace4f8a9c91beb21a0077e12b75637a4560a542/conf/spark-env.sh.template#L49>,
but my testing shows that only hostnames work.
Nick
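(Sketch of the two forms being compared, assuming the variable behind that template line is SPARK_MASTER_IP; the host name and IP below are placeholders, and per the report above only the hostname form worked.)

# conf/spark-env.sh
SPARK_MASTER_IP=master-host      # hostname form (reported to work)
# SPARK_MASTER_IP=192.168.1.10   # IP form (reported not to work in that test)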
…execute goal org.apache.maven.plugins:maven-shade-plugin:2.4.1:shade (default) on project spark-network-common_2.10: Error creating shaded jar: C:\Users\Annabel\git\spark\network\common\dependency-reduced-pom.xml (Access is denied) -> [Help 1]
Any idea? I'm running a 64-bit Windows 8 machine…