+1
-- Original --
From: "Matei Zaharia";
Date: Thu, Nov 6, 2014 3:21
To: "Sean Owen";
Cc: "dev";
Subject: Re: [VOTE] Designating maintainers for some Spark components
Several people asked about having maintainers review the PR queue for th
Who is the maintainer for machine learning? Spark is missing some features for
machine learning, for example, a parameter server.
> On Nov 12, 2015, at 05:32, Matei Zaharia wrote:
>
> I like the idea of popping out Tachyon to an optional component too to reduce
> the number of dependencies. In the future, it
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10193837
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10194313
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10194372
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/44#issuecomment-36449283
Yes, adding too many require statements is unwise. We guarantee that an error
is thrown when appropriate, and leave the rest to the developer to resolve.
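The kind of check under discussion might look like the following (a hypothetical sketch, not the actual patch in PR #44; the method name `checkPartitions` is invented for illustration):

```scala
// Hypothetical sketch for SPARK-1149: fail fast with a clear error when a
// caller passes out-of-range partition indices, instead of letting the job hang.
def checkPartitions(partitions: Seq[Int], totalPartitions: Int): Unit = {
  partitions.foreach { p =>
    require(p >= 0 && p < totalPartitions,
      s"Partition index $p is out of range [0, $totalPartitions)")
  }
}
```

A single `require` at the entry point keeps the failure close to the caller's mistake rather than deep inside the scheduler.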
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10199454
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -950,6 +952,8 @@ class SparkContext(
resultHandler: (Int, U) => U
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10239276
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
GitHub user witgo opened a pull request:
https://github.com/apache/spark/pull/70
Update building-with-maven.md
mvn -Dhadoop.version=... -Dsuites=spark.repl.ReplSuite test
to
mvn -Dhadoop.version=... -Dsuites=org.apache.spark.repl.ReplSuite test
You can
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10242185
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10242279
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10242460
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10243090
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,7 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10243307
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,7 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10284698
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10370902
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/25#issuecomment-37157952
Building the current master using Maven gives the same compiler error:
git checkout 5d98cfc1c8fb17fbbeacc7192ac21c0b038cbd16
mvn -U -Pyarn
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/25#issuecomment-37162150
Now, even without using a proxy, I get the same compiler error.
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/145#issuecomment-37713958
Well done, the PR can fix [SPARK-1248](https://spark-project.atlassian.net/browse/SPARK-1248)
GitHub user witgo opened a pull request:
https://github.com/apache/spark/pull/150
Fix SPARK-1256: Master web UI and Worker web UI returns a 404 error
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/witgo/spark SPARK-1256
There are related discussions in the following PR:
https://github.com/apache/spark/pull/332
-- Original --
From: "Marcelo Vanzin";;
Date: Fri, Apr 11, 2014 08:16 AM
To: "dev";
Subject: RFC: varargs in Logging.scala?
Hey there,
While going through the
-1
The following bug should be fixed:
https://issues.apache.org/jira/browse/SPARK-1817
https://issues.apache.org/jira/browse/SPARK-1712
-- Original --
From: "Patrick Wendell";;
Date: Wed, May 14, 2014 04:07 AM
To: "dev@spark.apache.org";
Subject: Re: [VOTE]
4, 2014 03:02 PM
To: "dev@spark.apache.org";
Subject: Re: [VOTE] Release Apache Spark 1.0.0 (rc5)
Hey @witgo - those bugs are not severe enough to block the release,
but it would be nice to get them fixed.
At this point we are focused on severe bugs with an immediate fix, or
regre
You need to set:
spark.akka.frameSize 5
spark.default.parallelism 1
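As a sketch, these settings could live in `conf/spark-defaults.conf` (the values simply mirror the suggestion above; whether they fit a given workload is an assumption):

```
spark.akka.frameSize        5
spark.default.parallelism   1
```

They could equally be set programmatically via `SparkConf.set` before creating the SparkContext.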
-- Original --
From: "Madhu";;
Date: Wed, May 14, 2014 09:15 AM
To: "dev";
Subject: Re: [VOTE] Release Apache Spark 1.0.0 (rc5)
I just built rc5 on Windows 7 and tried to re
How to reproduce this bug?
-- Original --
From: "Patrick Wendell";;
Date: Mon, May 19, 2014 10:08 AM
To: "dev@spark.apache.org";
Cc: "Tom Graves";
Subject: Re: [VOTE] Release Apache Spark 1.0.0 (rc9)
Hey Matei - the issue you found is not related to secur
Is it a lack of hard disk space? If so, you can try
https://github.com/apache/spark/pull/828
-- Original --
From: "Sue Cai";;
Date: Wed, May 21, 2014 03:31 PM
To: "dev";
Subject: MLlib ALS-- Errors communicating with MapOutputTracker
Hello,
I am currently usi
Uh, my name was written wrong; it should be Guoqiang Li rather than Guoquiang Li.
-- Original --
From: "Kan Zhang";;
Date: Wed, Jun 4, 2014 03:00 AM
To: "dev";
Subject: Re: Add my JIRA username (hsaputra) to Spark's contributor's list
Same here please, userna
-1
The following bug should be fixed:
https://issues.apache.org/jira/browse/SPARK-2677
-- Original --
From: "Tathagata Das";;
Date: Sat, Jul 26, 2014 07:08 AM
To: "dev@spark.apache.org";
Subject: [VOTE] Release Apache Spark 1.0.2 (RC1)
Please vote on r
Is that a regression since 1.0.0?
On Jul 27, 2014 10:43 AM, "witgo" wrote:
> -1
> The following bug should be fixed:
> https://issues.apache.org/jira/browse/SPARK-2677
>
>
>
>
>
> -- Original --
> From: "Tathagata D
You can try these commands:
./sbt/sbt assembly
./sbt/sbt "test-only *.HiveCompatibilitySuite" -Phive
-- Original --
From: "田毅";;
Date: Fri, Aug 1, 2014 05:00 PM
To: "dev";
Subject: How to run specific sparkSQL test with maven
Hi everyone!
Could any
You need the parameter "-Phadoop-2.3", e.g.:
./make-distribution.sh -Dhadoop.version=2.3.0-cdh5.0.2
-Dyarn.version=2.3.0-cdh5.0.2 -Phadoop-2.3 -Pyarn
-- Original --
From: "Debasish Das";
Date: Fri, Aug 8, 2014 3:18
To: "Patrick Wendell";
Cc:
There's a related discussion
https://issues.apache.org/jira/browse/SPARK-2815
-- Original --
From: "Chester Chen";
Date: Thu, Aug 21, 2014 7:42
To: "dev";
Subject: Re: is Branch-1.1 SBT build broken for yarn-alpha?
Just tried on master branc
GitHub user witgo opened a pull request:
https://github.com/apache/spark/pull/25
SPARK-1125: When using an HTTP proxy, the Maven build fails for Spark Examples
When building Spark Examples with Maven behind an HTTP proxy, the build throws:
Failure to find org.eclipse.paho:mqtt-client:jar:0.4.0 in
https
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/25#issuecomment-36218375
Perhaps no one else has encountered the same problem. Well, let me close this PR.
Github user witgo closed the pull request at:
https://github.com/apache/spark/pull/25
GitHub user witgo opened a pull request:
https://github.com/apache/spark/pull/44
Fix SPARK-1149: Bad partitioners can cause Spark to hang
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/witgo/spark SPARK-1149
Alternatively you
Github user witgo commented on a diff in the pull request:
https://github.com/apache/spark/pull/44#discussion_r10166854
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -847,6 +847,8 @@ class SparkContext(
partitions: Seq[Int