Re: Spark 1.6.1: Unexpected partition behavior?

2016-06-26 Thread Randy Gelhausen
Sorry, please ignore the above. I now see I called coalesce on a different reference than the one I used to register the table. On Sun, Jun 26, 2016 at 6:34 PM, Randy Gelhausen wrote: > > val enriched_web_logs = sqlContext.sql(""" > select web_logs.datetime, web_logs.node as app_host, source_ip, b.no
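For readers who hit the same thing: coalesce() returns a new DataFrame rather than modifying the one already registered as a temp table, so SQL against the registered name keeps the original partitioning. A minimal sketch of the pitfall, runnable in spark-shell where sqlContext is predefined (paths are hypothetical; column names are borrowed from the query above, not the actual job):

// Hypothetical input path; register `logs` as a temp table.
val logs = sqlContext.read.parquet("/tmp/web_logs")
logs.registerTempTable("web_logs")

// This builds a NEW single-partition DataFrame; it does not change `logs`
// or the registered "web_logs" table.
val collapsed = logs.coalesce(1)

// Queries against the temp table still run over `logs`, so to get a single
// output file, coalesce the DataFrame you actually write:
val enriched = sqlContext.sql("select datetime, node as app_host, source_ip from web_logs")
enriched.coalesce(1).write.format("parquet").mode("overwrite").save("/tmp/enriched_web_logs")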

Spark 1.6.1: Unexpected partition behavior?

2016-06-26 Thread Randy Gelhausen
val enriched_web_logs = sqlContext.sql(""" select web_logs.datetime, web_logs.node as app_host, source_ip, b.node as source_host, log from web_logs left outer join (select distinct node, address from nodes) b on source_ip = address """) enriched_web_logs.coalesce(1).write.format("parquet").mode("o

Re: Spark 1.6.1 packages on S3 corrupt?

2016-04-12 Thread Nicholas Chammas
Yes, this is a known issue. The core devs are already aware of it. [CC dev] FWIW, I believe the Spark 1.6.1 / Hadoop 2.6 package on S3 is not corrupt. It may be the only 1.6.1 package that is not corrupt, though. :/ Nick On Tue, Apr 12, 2016 at 9:00 PM Augustus Hong wrote: > Hi all, >

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-11 Thread Ted Yu
Gentle ping: spark-1.6.1-bin-hadoop2.4.tgz from S3 is still corrupt. On Wed, Apr 6, 2016 at 12:55 PM, Josh Rosen wrote: > Sure, I'll take a look. Planning to do full verification in a bit. > > On Wed, Apr 6, 2016 at 12:54 PM Ted Yu wrote: > >> Josh: >> Can you ch

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Josh Rosen
Sure, I'll take a look. Planning to do full verification in a bit. On Wed, Apr 6, 2016 at 12:54 PM Ted Yu wrote: > Josh: > Can you check spark-1.6.1-bin-hadoop2.4.tgz ? > > $ tar zxf spark-1.6.1-bin-hadoop2.4.tgz > > gzip: stdin: not in gzip format > tar: Child returne

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Ted Yu
Josh: Can you check spark-1.6.1-bin-hadoop2.4.tgz ? $ tar zxf spark-1.6.1-bin-hadoop2.4.tgz gzip: stdin: not in gzip format tar: Child returned status 1 tar: Error is not recoverable: exiting now $ ls -l !$ ls -l spark-1.6.1-bin-hadoop2.4.tgz -rw-r--r--. 1 hbase hadoop 323614720 Apr 5 19:25

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Nicholas Chammas
Thank you Josh! I confirmed that the Spark 1.6.1 / Hadoop 2.6 package on S3 is now working, and the SHA512 checks out. On Wed, Apr 6, 2016 at 3:19 PM Josh Rosen wrote: > I downloaded the Spark 1.6.1 artifacts from the Apache mirror network and > re-uploaded them to the spark-related-packa

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Josh Rosen
I downloaded the Spark 1.6.1 artifacts from the Apache mirror network and re-uploaded them to the spark-related-packages S3 bucket, so hopefully these packages should be fixed now. On Mon, Apr 4, 2016 at 3:37 PM Nicholas Chammas wrote: > Thanks, that was the command. :thumbsup: > > On

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Nicholas Chammas
…served off of CloudFront > (i.e. > >> the “direct download” option on spark.apache.org) are also corrupt. > >> > >> Btw what’s the correct way to verify the SHA of a Spark package? I’ve > tried > >> a few commands on working packages downloaded from Apache

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Jakob Odersky
…package? I’ve tried >> a few commands on working packages downloaded from Apache mirrors, but I >> can’t seem to reproduce the published SHA for spark-1.6.1-bin-hadoop2.6.tgz. >> >> >> On Mon, Apr 4, 2016 at 11:45 AM Ted Yu wrote: >>> >>> Maybe temporarily ta

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Jakob Odersky
…verify the SHA of a Spark package? I’ve tried > a few commands on working packages downloaded from Apache mirrors, but I > can’t seem to reproduce the published SHA for spark-1.6.1-bin-hadoop2.6.tgz. > > > On Mon, Apr 4, 2016 at 11:45 AM Ted Yu wrote: >> >> Maybe temporarily take

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Nicholas Chammas
can’t seem to reproduce the published SHA for spark-1.6.1-bin-hadoop2.6.tgz <http://www.apache.org/dist/spark/spark-1.6.1/spark-1.6.1-bin-hadoop2.6.tgz.sha>. On Mon, Apr 4, 2016 at 11:45 AM Ted Yu wrote: > Maybe temporarily take out the artifacts on S3 before the root cause is > found
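On the SHA question raised here: the .sha files published alongside these packages hold SHA-512 digests (an April 6 reply above notes "the SHA512 checks out"), and they appear to use gpg's block-formatted output rather than the one-line format that sha512sum -c expects. A rough manual check, assuming gpg and Perl's shasum are installed (illustrative only, not an official procedure):

$ wget http://www.apache.org/dist/spark/spark-1.6.1/spark-1.6.1-bin-hadoop2.6.tgz.sha
$ gpg --print-md SHA512 spark-1.6.1-bin-hadoop2.6.tgz    # block format, like the published .sha
$ cat spark-1.6.1-bin-hadoop2.6.tgz.sha                  # compare the two outputs by eye
$ shasum -a 512 spark-1.6.1-bin-hadoop2.6.tgz            # plain one-line digest, same hex value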

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Ted Yu
…packages? It's still a problem. >> >> Also, it would be good to understand why this is happening. >> >> On Fri, Mar 18, 2016 at 6:49 PM Jakob Odersky wrote: >> >>> I just realized you're using a different download site. Sorry for the >>

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Kousuke Saruta
…overlooked that. Thanks. Kousuke On 2016/04/04 22:58, Nicholas Chammas wrote: This is still an issue. The Spark 1.6.1 packages on S3 are corrupt. Is anyone looking into this issue? Is there anything contributors can do to help solve this problem? Nick On Sun, Mar 27, 2016

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Jitendra Shelar
We can think of using checksums for this kind of issue. On Mon, Apr 4, 2016 at 8:32 PM, Kousuke Saruta wrote: > Oh, I overlooked that. Thanks. > > Kousuke > > > On 2016/04/04 22:58, Nicholas Chammas wrote: > > This is still an issue. The Spark 1.6.1 packages on S3 ar

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Kousuke Saruta
Oh, I overlooked that. Thanks. Kousuke On 2016/04/04 22:58, Nicholas Chammas wrote: This is still an issue. The Spark 1.6.1 packages on S3 are corrupt. Is anyone looking into this issue? Is there anything contributors can do to help solve this problem? Nick On Sun, Mar 27, 2016 at 8:49

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Nicholas Chammas
This is still an issue. The Spark 1.6.1 packages on S3 are corrupt. Is anyone looking into this issue? Is there anything contributors can do to help solve this problem? Nick On Sun, Mar 27, 2016 at 8:49 PM Nicholas Chammas wrote: > Pingity-ping-pong since this is still a problem. > >

Spark 1.6.1 binary pre-built for Hadoop 2.6 may be broken

2016-04-04 Thread Kousuke Saruta
Hi all, I noticed the pre-built binary for Hadoop 2.6, which we can download from spark.apache.org/downloads.html (Direct Download), may be broken. I couldn't decompress at least the following 4 tgzs with the "tar xfzv" command, and the md5 checksum didn't match. * spark-1.6.1-bin-hadoop2

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-27 Thread Nicholas Chammas
…Jakob Odersky wrote: >>> >>>> I just realized you're using a different download site. Sorry for the >>>> confusion, the link I get for a direct download of Spark 1.6.1 / >>>> Hadoop 2.6 is >>>> http://d3kbcqa49mib13.cloudfront.

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-24 Thread Michael Armbrust
…Also, it would be good to understand why this is happening. >> >> On Fri, Mar 18, 2016 at 6:49 PM Jakob Odersky wrote: >> >>> I just realized you're using a different download site. Sorry for the >>> confusion, the link I get for a direct download of Spark 1.6.

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-24 Thread Nicholas Chammas
>> …Sorry for the confusion, the link I get for a direct download of Spark 1.6.1 / >> Hadoop 2.6 is >> http://d3kbcqa49mib13.cloudfront.net/spark-1.6.1-bin-hadoop2.6.tgz >> >> On Fri, Mar 18, 2016 at 3:20 PM, Nicholas Chammas >> wrote: >> > I just retried the Spark 1.6.1 /

Re: error occurs to compile spark 1.6.1 using scala 2.11.8

2016-03-22 Thread Ted Yu
…fired the build process by clicking "Rebuild Project" in > "Build" menu in IDEA IDE. > > more info here: > Spark 1.6.1 + scala 2.11.8 + IDEA 15.0.3 + Maven 3.3.3 > > I can build spark 1.6.1 with scala 2.10.4 successf

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-21 Thread Nicholas Chammas
link I get for a direct download of Spark 1.6.1 / > Hadoop 2.6 is > http://d3kbcqa49mib13.cloudfront.net/spark-1.6.1-bin-hadoop2.6.tgz > > On Fri, Mar 18, 2016 at 3:20 PM, Nicholas Chammas > wrote: > > I just retried the Spark 1.6.1 / Hadoop 2.6 download and got a corrupt > ZIP &

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-20 Thread Nicholas Chammas
I'm seeing the same. :( On Fri, Mar 18, 2016 at 10:57 AM Ted Yu wrote: > I tried again this morning : > > $ wget > https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz > --2016-03-18 07:55:30-- > https://s3.amazonaws.com/spark-related-packages/sp

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Ted Yu
I tried again this morning : $ wget https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz --2016-03-18 07:55:30-- https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz Resolving s3.amazonaws.com... 54.231.19.163 ... $ tar zxf spark-1.6.1-bin

Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Nicholas Chammas
https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz Does anyone else have trouble unzipping this? How did this happen? What I get is: $ gzip -t spark-1.6.1-bin-hadoop2.6.tgz gzip: spark-1.6.1-bin-hadoop2.6.tgz: unexpected end of file gzip: spark-1.6.1-bin-hadoop2.6.tgz

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Ted Yu
On Linux, I got: $ tar zxf spark-1.6.1-bin-hadoop2.6.tgz gzip: stdin: unexpected end of file tar: Unexpected EOF in archive tar: Unexpected EOF in archive tar: Error is not recoverable: exiting now On Wed, Mar 16, 2016 at 5:15 PM, Nicholas Chammas < nicholas.cham...@gmail.com> wrote: >

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Nicholas Chammas
it should be fixed now. > On Mar 16, 2016 5:48 PM, "Nicholas Chammas" > wrote: > >> Looks like the other packages may also be corrupt. I’m getting the same >> error for the Spark 1.6.1 / Hadoop 2.4 package. >> >> >> https://s3.amazonaws.com/s

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Nicholas Chammas
Looks like the other packages may also be corrupt. I’m getting the same error for the Spark 1.6.1 / Hadoop 2.4 package. https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.4.tgz Nick On Wed, Mar 16, 2016 at 8:28 PM Ted Yu wrote: > On Linux, I got: > > $ tar

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Ted Yu
Same with hadoop 2.3 tar ball: $ tar zxf spark-1.6.1-bin-hadoop2.3.tgz gzip: stdin: unexpected end of file tar: Unexpected EOF in archive tar: Unexpected EOF in archive tar: Error is not recoverable: exiting now On Wed, Mar 16, 2016 at 5:47 PM, Nicholas Chammas < nicholas.cham...@gmail.

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Michael Armbrust
Patrick reuploaded the artifacts, so it should be fixed now. On Mar 16, 2016 5:48 PM, "Nicholas Chammas" wrote: > Looks like the other packages may also be corrupt. I’m getting the same > error for the Spark 1.6.1 / Hadoop 2.4 package. > > > https://s3.amazonaws.com/spa

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Jakob Odersky
( > > On Fri, Mar 18, 2016 at 10:57 AM Ted Yu wrote: >> >> I tried again this morning : >> >> $ wget >> https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz >> --2016-03-18 07:55:30-- >> https://s3.amazonaws.com/spark-relate

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Nicholas Chammas
I just retried the Spark 1.6.1 / Hadoop 2.6 download and got a corrupt ZIP file. Jakob, are you sure the ZIP unpacks correctly for you? Is it the same Spark 1.6.1 / Hadoop 2.6 package you had success with? On Fri, Mar 18, 2016 at 6:11 PM Jakob Odersky wrote: > I just experienced the is

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Jakob Odersky
I just realized you're using a different download site. Sorry for the confusion, the link I get for a direct download of Spark 1.6.1 / Hadoop 2.6 is http://d3kbcqa49mib13.cloudfront.net/spark-1.6.1-bin-hadoop2.6.tgz On Fri, Mar 18, 2016 at 3:20 PM, Nicholas Chammas wrote: > I just ret

[ANNOUNCE] Announcing Spark 1.6.1

2016-03-10 Thread Michael Armbrust
Spark 1.6.1 is a maintenance release containing stability fixes. This release is based on the branch-1.6 maintenance branch of Spark. We *strongly recommend* that all 1.6.0 users upgrade to this release. Notable fixes include: - Workaround for OOM when writing large partitioned tables SPARK-12546

[RESULT] [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-09 Thread Michael Armbrust
>>>>>>>> at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253) >>>>>>>> at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJ

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-09 Thread Michael Armbrust
…org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58) >>>>>>> ... >>>>>>> Cause: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: >>>>>>> java.io

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-09 Thread Kousuke Saruta
2016 at 20:00 UTC and passes if a majority of at least 3 +1 PMC votes are cast. [ ] +1 Release this package as Apache Spark 1.6.1 [ ] -1 Do not release this package because ...

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-08 Thread Burak Yavuz
…org.glassfish.jersey.apache.connector.ApacheConnector$1.run(ApacheConnector.java:491) >>>>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>>>>> at

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-08 Thread Andrew Or
…java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>>>> at jersey.repackaged.com.google.common.util.concurrent.More

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-08 Thread Yin Huai
…java:110) >>>> at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:50) >>>> at jersey.repackaged.com.google.common.util.concurrent.AbstractListening

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-07 Thread Reynold Xin
>>> at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:37) >>> at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:487)

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-06 Thread Egor Pahomov
>> at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:487) >> at org.glassfish.jersey.client.ClientRuntime$2.run(ClientRuntime.java:177) >> ... >> Cause: java.io.IOException: No such fil

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Yin Yang
…Armbrust > wrote: > >> Please vote on releasing the following candidate as Apache Spark version >> 1.6.1! >> >> The vote is open until Saturday, March 5, 2016 at 20:00 UTC and passes if >> a majority of at least 3 +1 PMC votes are cast. >> >> [ ] +1 Release

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Sean Owen
FWIW I was running this with OpenJDK 1.8.0_66 On Thu, Mar 3, 2016 at 7:43 PM, Tim Preece wrote: > Regarding the failure in > org.apache.spark.streaming.kafka.DirectKafkaStreamSuite "offset recovery" > > We have been seeing the very same problem with the IBM JDK for quite a long > time ( since at

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Tim Preece
Regarding the failure in org.apache.spark.streaming.kafka.DirectKafkaStreamSuite "offset recovery": We have been seeing the very same problem with the IBM JDK for quite a long time (since at least July 2015). It is intermittent and we had dismissed it as a testcase problem.

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Sean Owen
> Cause: java.io.IOException: No such file or directory > at jnr.unixsocket.UnixSocketChannel.doConnect(UnixSocketChannel.java:94) > > Has anyone seen the above? > > On Wed, Mar 2, 2016 at 2:45 PM, Michael Armbrust > wrote: >> >> Please

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Yin Yang
at 20:00 UTC and passes if > a majority of at least 3 +1 PMC votes are cast. > > [ ] +1 Release this package as Apache Spark 1.6.1 > [ ] -1 Do not release this package because ... > > To learn more about Apache Spark, please see http://spark.apache.org/ > > T

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Tim Preece
I just created the following pull request (against master, but I would like it on 1.6.1) for the isolated classloader fix (SPARK-13648): https://github.com/apache/spark/pull/11495

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Tim Preece
I have been testing 1.6.1 RC1 using the IBM Java SDK. I noticed a problem (with the org.apache.spark.sql.hive.client.VersionsSuite tests) after a recent Spark 1.6.1 change. Pull request - https://github.com/apache/spark/commit/f7898f9e2df131fa78200f6034508e74a78c2a44 The change introduced a

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-02 Thread Mark Hamstra
> [ ] +1 Release this package as Apache Spark 1.6.1 > [ ] -1 Do not release this package because ... > > To learn more about Apache Spark, please see http://spark.apache.org/ > > The tag to be voted on is *v1.6.1-rc1 > (15de51c238a7340fa81cb0b80d029a05d97bfc5c) > <https://gith

[VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-02 Thread Michael Armbrust
Please vote on releasing the following candidate as Apache Spark version 1.6.1! The vote is open until Saturday, March 5, 2016 at 20:00 UTC and passes if a majority of at least 3 +1 PMC votes are cast. [ ] +1 Release this package as Apache Spark 1.6.1 [ ] -1 Do not release this package because

Re: Spark 1.6.1

2016-02-26 Thread Josh Rosen
I updated the release packaging scripts to use SFTP via the *lftp* client: https://github.com/apache/spark/pull/11350 I'm starting the process of cutting a 1.6.1-RC1 tag and release artifacts right now, so please be extra careful about merging into branch-1.6 until after the release. Once the RC p
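For context, lftp can drive an SFTP upload non-interactively, which is what makes it usable from a packaging script: it accepts its command string on the command line via -e, so it drops cleanly into a shell script. A purely illustrative sketch; the user, host, and remote path are placeholders, not the values used by the actual release scripts in that pull request:

$ lftp -e "cd public_html; put spark-1.6.1-bin-hadoop2.6.tgz; bye" sftp://USER@HOST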

Re: Spark 1.6.1

2016-02-24 Thread Yin Yang
Have you tried using scp? scp file i...@people.apache.org Thanks On Wed, Feb 24, 2016 at 5:04 PM, Michael Armbrust wrote: > Unfortunately I don't think that's sufficient as they don't seem to support > sftp in the same way they did before. We'll still need to update our > release scripts. > >

Re: Spark 1.6.1

2016-02-24 Thread Michael Armbrust
Unfortunately I don't think that's sufficient as they don't seem to support sftp in the same way they did before. We'll still need to update our release scripts. On Wed, Feb 24, 2016 at 2:09 AM, Yin Yang wrote: > Looks like access to people.apache.org has been restored. > > FYI > > On Mon, Feb 2

Re: Spark 1.6.1

2016-02-24 Thread Yin Yang
Looks like access to people.apache.org has been restored. FYI On Mon, Feb 22, 2016 at 10:07 PM, Luciano Resende wrote: > > > On Mon, Feb 22, 2016 at 9:08 PM, Michael Armbrust > wrote: > >> An update: people.apache.org has been shut down so the release scripts >> are broken. Will try again af

Re: Spark 1.6.1

2016-02-22 Thread Reynold Xin
Yes, we don't want to clutter maven central. The staging repo is included in the release candidate voting thread. See the following for an example: http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-6-0-RC1-td15424.html On Mon, Feb 22, 2016 at 11:37 PM, Romi
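For Romi's pom.xml question below, consuming an RC from an ASF staging repository generally just means adding that repository to the build. A hedged sketch only: the URL is a placeholder following the usual orgapachespark-NNNN naming, and the real URL for a given RC is the one posted in its vote thread.

<!-- Hypothetical entry for pom.xml; replace the URL with the staging
     repository posted in the RC vote thread. -->
<repositories>
  <repository>
    <id>spark-rc-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-NNNN/</url>
    <releases><enabled>true</enabled></releases>
    <snapshots><enabled>false</enabled></snapshots>
  </repository>
</repositories>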

Re: Spark 1.6.1

2016-02-22 Thread Romi Kuntsman
Sounds fair. Is it to avoid cluttering maven central with too many intermediate versions? What do I need to add to my pom.xml to make it work? *Romi Kuntsman*, *Big Data Engineer* http://www.totango.com On Tue, Feb 23, 2016 at 9:34 AM, Reynold Xin wrote: > We usually publish to a stag

Re: Spark 1.6.1

2016-02-22 Thread Reynold Xin
We usually publish to a staging maven repo hosted by the ASF (not maven central). On Mon, Feb 22, 2016 at 11:32 PM, Romi Kuntsman wrote: > Is it possible to make RC versions available via Maven? (many projects do > that) > That will make integration much easier, so many more people can test th

Re: Spark 1.6.1

2016-02-22 Thread Romi Kuntsman
Is it possible to make RC versions available via Maven? (many projects do that) That will make integration much easier, so many more people can test the version before the final release. Thanks! *Romi Kuntsman*, *Big Data Engineer* http://www.totango.com On Tue, Feb 23, 2016 at 8:07 AM, Luciano R

Re: Spark 1.6.1

2016-02-22 Thread Luciano Resende
On Mon, Feb 22, 2016 at 9:08 PM, Michael Armbrust wrote: > An update: people.apache.org has been shut down so the release scripts > are broken. Will try again after we fix them. > > If you skip uploading to people.a.o, it should still be available in nexus for review. The other option is to add

Re: Spark 1.6.1

2016-02-22 Thread Michael Armbrust
An update: people.apache.org has been shut down so the release scripts are broken. Will try again after we fix them. On Mon, Feb 22, 2016 at 6:28 PM, Michael Armbrust wrote: > I've kicked off the build. Please be extra careful about merging into > branch-1.6 until after the release. > > On Mon,

Re: Spark 1.6.1

2016-02-22 Thread Michael Armbrust
I've kicked off the build. Please be extra careful about merging into branch-1.6 until after the release. On Mon, Feb 22, 2016 at 10:24 AM, Michael Armbrust wrote: > I will cut the RC today. Sorry for the delay! > > On Mon, Feb 22, 2016 at 5:19 AM, Patrick Woody > wrote: > >> Hey Michael, >>

Re: Spark 1.6.1

2016-02-22 Thread Michael Armbrust
I will cut the RC today. Sorry for the delay! On Mon, Feb 22, 2016 at 5:19 AM, Patrick Woody wrote: > Hey Michael, > > Any update on a first cut of the RC? > > Thanks! > -Pat > > On Mon, Feb 15, 2016 at 6:50 PM, Michael Armbrust > wrote: > >> I'm not going to be able to do anything until after

Re: Spark 1.6.1

2016-02-22 Thread Patrick Woody
Hey Michael, Any update on a first cut of the RC? Thanks! -Pat On Mon, Feb 15, 2016 at 6:50 PM, Michael Armbrust wrote: > I'm not going to be able to do anything until after the Spark Summit, but > I will kick off RC1 after that (end of week). Get your patches in before > then! > > On Sat, Fe

Re: Spark 1.6.1

2016-02-15 Thread Michael Armbrust
I'm not going to be able to do anything until after the Spark Summit, but I will kick off RC1 after that (end of week). Get your patches in before then! On Sat, Feb 13, 2016 at 4:57 PM, Jong Wook Kim wrote: > Is 1.6.1 going to be ready this week? I see that the two last unresolved > issues targ

Re: Spark 1.6.1

2016-02-13 Thread Jong Wook Kim
Is 1.6.1 going to be ready this week? I see that the last two unresolved issues targeting 1.6.1 are fixed now. On 3 February 2016 at 08:16, Daniel Darabos <daniel.dara...@lynxanalytics.com> wrote: > > On Tu

Re: Spark 1.6.1

2016-02-03 Thread Daniel Darabos
On Tue, Feb 2, 2016 at 7:10 PM, Michael Armbrust wrote: > What about the memory leak bug? >> https://issues.apache.org/jira/browse/SPARK-11293 >> Even after the memory rewrite in 1.6.0, it still happens in some cases. >> Will it be fixed for 1.6.1? >> > > I think we have enough issues queued up t

Re: Spark 1.6.1

2016-02-03 Thread Steve Loughran
…Ted Yu <yuzhih...@gmail.com>, "dev@spark.apache.org" Subject: Re: Spark 1.6.1 Hi Michael, What about the memory leak bug? https://issues.apache.org/jira/browse/SPARK-11293

Re: Spark 1.6.1

2016-02-02 Thread Mingyu Kim
Cool, thanks! Mingyu From: Michael Armbrust Date: Tuesday, February 2, 2016 at 10:48 AM To: Mingyu Kim Cc: Romi Kuntsman , Hamel Kothari , Ted Yu , "dev@spark.apache.org" , Punya Biswal , Robert Kruszewski Subject: Re: Spark 1.6.1 I'm waiting for a few last fixes to be

Re: Spark 1.6.1

2016-02-02 Thread Michael Armbrust
…Mingyu > > From: Romi Kuntsman > Date: Tuesday, February 2, 2016 at 3:16 AM > To: Michael Armbrust > Cc: Hamel Kothari , Ted Yu , > "dev@spark.apache.org" > Subject: Re: Spark 1.6.1 > > Hi Michael, > What about the memory leak bu

Re: Spark 1.6.1

2016-02-02 Thread Mingyu Kim
Hi all, Is there an estimated timeline for 1.6.1 release? Just wanted to check how the release is coming along. Thanks! Mingyu From: Romi Kuntsman Date: Tuesday, February 2, 2016 at 3:16 AM To: Michael Armbrust Cc: Hamel Kothari , Ted Yu , "dev@spark.apache.org" Subject: Re: S

Re: Spark 1.6.1

2016-02-02 Thread Michael Armbrust
> > What about the memory leak bug? > https://issues.apache.org/jira/browse/SPARK-11293 > Even after the memory rewrite in 1.6.0, it still happens in some cases. > Will it be fixed for 1.6.1? > I think we have enough issues queued up that I would not hold the release for that, but if there is a pa

Re: Spark 1.6.1

2016-02-02 Thread Romi Kuntsman
…issues but >> it does have some useful new features. It should be fully backwards >> compatible according to the Jackson folks. >> >> On Mon, Feb 1, 2016 at 10:29 AM Ted Yu wrote: >> >>> SPARK-12624 has been resolved. >>> According to Wenchen, SPARK-127

Re: Spark 1.6.1

2016-02-01 Thread Michael Armbrust
…backwards > compatible according to the Jackson folks. > > On Mon, Feb 1, 2016 at 10:29 AM Ted Yu wrote: > >> SPARK-12624 has been resolved. >> According to Wenchen, SPARK-12783 is fixed in 1.6.0 release. >> >> Are there other blockers for Spark 1.6.1? >>

Re: Spark 1.6.1

2016-02-01 Thread Hamel Kothari
…SPARK-12783 is fixed in 1.6.0 release. > > Are there other blockers for Spark 1.6.1? > > Thanks > > On Wed, Jan 13, 2016 at 5:39 PM, Michael Armbrust > wrote: > >> Hey All, >> >> While I'm not aware of any critical issues with 1.6.0, there are several >

Re: Spark 1.6.1

2016-02-01 Thread Ted Yu
SPARK-12624 has been resolved. According to Wenchen, SPARK-12783 is fixed in 1.6.0 release. Are there other blockers for Spark 1.6.1? Thanks On Wed, Jan 13, 2016 at 5:39 PM, Michael Armbrust wrote: > Hey All, > > While I'm not aware of any critical issues with 1.6.0, the

Re: Spark 1.6.1

2016-01-29 Thread Michael Armbrust
I think this is fixed in branch-1.6 already. If you can reproduce it there can you please open a JIRA and ping me? On Fri, Jan 29, 2016 at 12:16 PM, deenar < deenar.toras...@thinkreactive.co.uk> wrote: > Hi Michael > > The Dataset aggregators do not appear to support complex Spark-SQL types. I >

Re: Spark 1.6.1

2016-01-29 Thread deenar
Hi Michael The Dataset aggregators do not appear to support complex Spark-SQL types. I wasn't sure if I was doing something wrong or if this was a bug or a feature not implemented yet. Having this in would be great. See below (reposting this from the spark user list) https://docs.cloud.databricks

RE: Spark 1.6.1

2016-01-25 Thread Ewan Leith
like that, and you can embed the same code in your own packages, outside of the main Spark releases. Thanks, Ewan -Original Message- From: BrandonBradley [mailto:bradleytas...@gmail.com] Sent: 22 January 2016 14:29 To: dev@spark.apache.org Subject: Re: Spark 1.6.1 I'd like mor

Re: Spark 1.6.1

2016-01-22 Thread BrandonBradley
I'd like more complete Postgres JDBC support for ArrayType before the next release. Some array types are still broken in 1.6.0. It would save me much time. Please see SPARK-12747 @ https://issues.apache.org/jira/browse/SPARK-12747 Cheers! Brandon Bradley

Spark 1.6.1

2016-01-13 Thread Michael Armbrust
Hey All, While I'm not aware of any critical issues with 1.6.0, there are several corner cases that users are hitting with the Dataset API that are fixed in branch-1.6. As such I'm considering a 1.6.1 release. At the moment there are only two critical issues targeted for 1.6.1: - SPARK-12624 -