GitHub user dbtsai opened a pull request:
https://github.com/apache/spark/pull/53
SPARK-1157 L-BFGS Optimizer based on L-BFGS Java implementation in RISO
project.
This will use the L-BFGS Java implementation from the RISO project (published
in Maven Central), which is a direct translation
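The RISO code itself is not included above; purely as a rough, hedged illustration of what an L-BFGS optimizer computes, the self-contained Scala sketch below implements only the standard two-loop recursion that turns the current gradient and a short history of (position-change, gradient-change) pairs into a search direction. The names (`LBFGSSketch`, `twoLoopDirection`, `sHist`, `yHist`) are invented for this example and are not taken from the PR or from RISO.

```scala
// Hedged sketch only: the textbook L-BFGS two-loop recursion, not the RISO
// implementation referenced in the PR. All names here are illustrative.
object LBFGSSketch {

  private def dot(a: Array[Double], b: Array[Double]): Double =
    a.zip(b).map { case (x, y) => x * y }.sum

  /**
   * Computes the L-BFGS search direction -H_k * grad from the current
   * gradient and the m most recent correction pairs, newest first:
   * sHist(i) = x_{k-i} - x_{k-i-1}, yHist(i) = g_{k-i} - g_{k-i-1}.
   */
  def twoLoopDirection(grad: Array[Double],
                       sHist: Seq[Array[Double]],
                       yHist: Seq[Array[Double]]): Array[Double] = {
    require(sHist.length == yHist.length, "history sizes must match")
    val rho = sHist.zip(yHist).map { case (s, y) => 1.0 / dot(y, s) }
    val alpha = new Array[Double](sHist.length)
    val q = grad.clone()

    // First loop: newest to oldest correction pair.
    for (i <- sHist.indices) {
      alpha(i) = rho(i) * dot(sHist(i), q)
      for (j <- q.indices) q(j) -= alpha(i) * yHist(i)(j)
    }

    // Scale by the usual initial Hessian approximation gamma * I.
    val gamma =
      if (sHist.isEmpty) 1.0
      else dot(sHist.head, yHist.head) / dot(yHist.head, yHist.head)
    val r = q.map(_ * gamma)

    // Second loop: oldest to newest correction pair.
    for (i <- sHist.indices.reverse) {
      val beta = rho(i) * dot(yHist(i), r)
      for (j <- r.indices) r(j) += sHist(i)(j) * (alpha(i) - beta)
    }

    r.map(v => -v) // descent direction
  }
}
```

Plugged into an iterative loop with a line search, this reuse of curvature information from recent iterations is what distinguishes L-BFGS from the plain gradient-descent optimizers already in MLlib.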
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10194372
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int],
allowLoca
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10194343
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int],
allowL
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10194313
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int],
allowLoca
Github user CodingCat commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10194239
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int],
allow
Github user CodingCat commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10194233
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int],
allow
Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/12#issuecomment-36445655
I rebased the code after https://github.com/apache/spark/pull/11 was
merged and tested it on my local side; I think it is ready for further
review/testing.
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10193837
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int],
allowLoca
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/52
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/11
Github user markhamstra commented on the pull request:
https://github.com/apache/spark/pull/51#issuecomment-36443418
Ah, I see. create-release.sh was handled in another PR.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36442931
Thanks, merged into master.
Github user markhamstra commented on the pull request:
https://github.com/apache/spark/pull/51#issuecomment-36442877
Looks good. The only remaining incubat* I find are in
dev/create-release/create-release.sh, but I'm not sure how you use that script.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/52#issuecomment-36442879
Thanks, merged
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/51#issuecomment-36442839
Merged build finished.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/51#issuecomment-36442840
One or more automated tests failed
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12946/
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/51#issuecomment-36442811
Merged build started.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/51#issuecomment-36442810
Merged build triggered.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/52#issuecomment-36442809
Can one of the admins verify this patch?
GitHub user CodingCat opened a pull request:
https://github.com/apache/spark/pull/52
[SPARK-1150] fix repo location in create script (re-open)
reopen for https://spark-project.atlassian.net/browse/SPARK-1150
You can merge this pull request into a Git repository by running:
$ gi
Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/48#issuecomment-36442789
Sure, just reopened: https://github.com/apache/spark/pull/52
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36442757
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12945/
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36442756
Merged build finished.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/48#issuecomment-36442725
@CodingCat do you mind re-opening this? Something happened and the merge
got screwed up so I had to revert it. Somehow the merge script was pulling in a
different patch (p
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/48
GitHub user pwendell opened a pull request:
https://github.com/apache/spark/pull/51
Remove remaining references to incubation
This removes some loose ends not caught by the other (incubating -> tlp)
patches. @markhamstra this updates the version as you mentioned earlier.
You can me
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/48#issuecomment-36441593
thanks, merged into master.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36441580
Merged build triggered.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36441581
Merged build started.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36441402
Jenkins, test this please.
Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36440931
@pwendell done
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36440024
Hey @CodingCat found a tiny issue but otherwise LGTM - if you patch it I
can merge.
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/11#discussion_r10193104
--- Diff: core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
---
@@ -642,7 +643,7 @@ class PairRDDFunctions[K: ClassTag, V: ClassTag](self:
RDD[(
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10192944
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int],
allowL
Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/11#issuecomment-36437295
@pwendell Thank you again! Just updated the code
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/17#discussion_r10192610
--- Diff: docs/building-with-maven.md ---
@@ -76,3 +78,11 @@ The maven build includes support for building a Debian
package containing the as
$ mvn -P
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/44#issuecomment-36436962
Merged build finished.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/44#issuecomment-36436963
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12944/
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/17#issuecomment-36436518
Hey @ScrapCodes - this is really cool! It's great that we will get this
into Spark 1.0 so that we can support Java 8 lambdas moving forward. I have
some high level suggest
Stash is an enterprise Git server from Atlassian.
I got it... Basically the PRs are managed by GitHub, and if I have to work on
a PR, I should rather make use of my GitHub account...
Thanks for the clarification.
On Sat, Mar 1, 2014 at 12:27 PM, Reynold Xin wrote:
> I'm not sure what you mean by ent
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/43#discussion_r10192379
--- Diff: core/src/main/scala/org/apache/spark/storage/DiskStore.scala ---
@@ -84,12 +84,27 @@ private class DiskStore(blockManager: BlockManager,
diskManager:
I'm not sure what you mean by enterprise Stash.
But a PR is a concept unique to GitHub. There is no PR model in plain Git or in
the Git repositories the ASF maintains.
On Sat, Mar 1, 2014 at 11:28 AM, Debasish Das wrote:
> Hi,
>
> We have a mirror repo of spark at our internal stash.
>
> We are adding changes to a
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/17#discussion_r10192306
--- Diff: docs/building-with-maven.md ---
@@ -76,3 +78,11 @@ The maven build includes support for building a Debian
package containing the as
$ mvn -P
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/44#issuecomment-36435309
Merged build started.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/44#issuecomment-36435308
Merged build triggered.
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/17#discussion_r10192279
--- Diff: docs/java-programming-guide.md ---
@@ -30,6 +30,12 @@ There are a few key differences between the Java and
Scala APIs:
classes for key-value p
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/27
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/17#discussion_r10192261
--- Diff: docs/building-with-maven.md ---
@@ -76,3 +78,11 @@ The maven build includes support for building a Debian
package containing the as
$ mvn -P
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/44#issuecomment-36435016
Jenkins, this is ok to test
Github user mateiz commented on a diff in the pull request:
https://github.com/apache/spark/pull/43#discussion_r10192236
--- Diff: core/src/main/scala/org/apache/spark/storage/DiskStore.scala ---
@@ -84,12 +84,27 @@ private class DiskStore(blockManager: BlockManager,
diskManager: D
We would like to cross-build Spark for Scala 2.11 and 2.10 eventually (they’re
a lot closer than 2.10 and 2.9). In Maven this might mean creating two POMs or
a special variable for the version or something.
Matei
On Mar 1, 2014, at 12:15 PM, Koert Kuipers wrote:
> does maven support cross bui
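Neither message shows a concrete build definition; purely as a hedged sketch of the sbt side that Koert mentions (the versions below are placeholders, not Spark's actual settings), cross-building usually amounts to something like:

```scala
// build.sbt -- hedged sketch of sbt cross-building; the versions are
// placeholders and are not taken from Spark's real build.
scalaVersion := "2.10.3"
crossScalaVersions := Seq("2.10.3", "2.11.0")

// The %% operator picks the dependency artifact compiled against the active
// Scala version, so each cross-built binary resolves matching dependencies.
libraryDependencies += "org.scalatest" %% "scalatest" % "2.1.0" % "test"
```

Running `sbt +compile` or `sbt +publish` then repeats the task once per listed Scala version; the Maven route Matei describes would instead thread a version property (or separate POMs) through the artifact names.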
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/17#discussion_r10192134
--- Diff: docs/java-programming-guide.md ---
@@ -127,11 +132,20 @@ class Split extends FlatMapFunction {
JavaRDD words = lines.flatMap(new Split());
{
Hi,
We have a mirror repo of spark at our internal stash.
We are adding changes to a fork of the mirror so that down the line we can
push the contributions back to Spark git.
I am not sure what exact development methodology we should follow,
as things are a bit complicated due to enterp
Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/7#issuecomment-36433922
close it
Github user CodingCat closed the pull request at:
https://github.com/apache/spark/pull/7
Github user kayousterhout commented on the pull request:
https://github.com/apache/spark/pull/27#issuecomment-36433891
I've merged this into master
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/11#discussion_r10191970
--- Diff: core/src/test/scala/org/apache/spark/FileSuite.scala ---
@@ -208,4 +209,25 @@ class FileSuite extends FunSuite with
LocalSparkContext {
asse
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/11#discussion_r10191952
--- Diff: core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
---
@@ -712,6 +713,16 @@ class PairRDDFunctions[K: ClassTag, V: ClassTag](self:
RDD[
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/32#issuecomment-36432409
Hey @srowen I looked through this and it looks good to me. I think there is
a merge conflict though, mind bringing this up to master? Then I can merge it.
Thanks!
Does Maven support cross-building for different Scala versions?
We do this in-house all the time with sbt. I know Spark does not cross-build
at this point, but is it guaranteed to stay that way?
On Sat, Mar 1, 2014 at 12:02 PM, Koert Kuipers wrote:
> i am still unsure what is wrong with sbt ass
I am still unsure what is wrong with sbt assembly. I would like a
real-world example of where it does not work that I can run.
This is what I know:
1) sbt assembly works fine for version conflicts for an artifact; no
exclusion rules are needed.
2) if artifacts have the same classes inside yet a
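As a hedged illustration of point (2), where different artifacts ship overlapping files, an sbt-assembly merge strategy typically looks something like the sketch below (key names vary across sbt-assembly versions, and this is not taken from Spark's build):

```scala
// build.sbt -- hedged sketch of resolving duplicate files with sbt-assembly;
// key names differ across plugin versions, and this is not Spark's build file.
assemblyMergeStrategy in assembly := {
  // Drop per-jar packaging metadata that would otherwise collide.
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  // Concatenate Typesafe config files instead of picking one arbitrarily.
  case "reference.conf"              => MergeStrategy.concat
  // Everything else keeps the plugin's default behaviour.
  case other =>
    val previous = (assemblyMergeStrategy in assembly).value
    previous(other)
}
```

The merge strategy only kicks in when two artifacts put a file at the same path in the fat jar; ordinary version conflicts for the same artifact are resolved earlier, by dependency resolution, which matches point (1) above.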
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/49#issuecomment-36420274
Merged build finished.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/49#issuecomment-36420275
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12943/
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/49#issuecomment-36419348
Merged build started.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/49#issuecomment-36419347
Merged build triggered.