Github user qqsun8819 commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37497727
Thanks @aarondav, it doesn't matter. Any further advice on this patch is welcome.
---
Github user aarondav commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37432422
This looks good to me, but I will leave this PR for a little longer in case
anyone wants to raise questions about changing the behavior here.
---
Github user qqsun8819 commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37407753
Modified the patch according to @aarondav's review.
---
Github user qqsun8819 commented on a diff in the pull request:
https://github.com/apache/spark/pull/110#discussion_r10463110
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1204,7 +1204,7 @@ object SparkContext extends Logging {
master match {
Github user aarondav commented on a diff in the pull request:
https://github.com/apache/spark/pull/110#discussion_r10442282
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1204,7 +1204,7 @@ object SparkContext extends Logging {
master match {
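(Both inline comments above are truncated, so the discussion itself is lost. For context, the surrounding `master match` block in SparkContext.createTaskScheduler looked roughly like the sketch below around that time; this is a reconstruction for illustration, not the PR's actual diff:)

    master match {
      case "local" =>
        val scheduler = new TaskSchedulerImpl(sc, MAX_LOCAL_TASK_FAILURES, isLocal = true)
        // Hard-coded single core: the default behavior SPARK-1099 questions.
        val backend = new LocalBackend(scheduler, 1)
        scheduler.initialize(backend)
        scheduler

      case LOCAL_N_REGEX(threads) =>
        val scheduler = new TaskSchedulerImpl(sc, MAX_LOCAL_TASK_FAILURES, isLocal = true)
        // local[N] already lets users pick a thread count explicitly.
        val backend = new LocalBackend(scheduler, threads.toInt)
        scheduler.initialize(backend)
        scheduler

      // ... cluster masters (standalone, Mesos, YARN) elided ...
    }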
Github user ash211 commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37154887
I find that new users often wonder why Spark is only using 1 core, and it's
because they expected local to use all their cores rather than defaulting to
just one. Changing
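(To make the confusion ash211 describes concrete, here is a minimal sketch of the user-visible behavior using the public SparkConf/SparkContext API; the app names are illustrative, and `local[*]`, which requests all available cores, is the option later Spark versions offer:)

    import org.apache.spark.{SparkConf, SparkContext}

    // master = "local" runs tasks in a single worker thread, so Spark uses
    // one core no matter how many the machine has.
    val oneCore = new SparkContext(
      new SparkConf().setMaster("local").setAppName("one-core"))
    println(oneCore.defaultParallelism)  // prints 1
    oneCore.stop()

    // master = "local[4]" explicitly asks for four threads.
    val fourCores = new SparkContext(
      new SparkConf().setMaster("local[4]").setAppName("four-cores"))
    println(fourCores.defaultParallelism)  // prints 4
    fourCores.stop()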
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37121987
Can one of the admins verify this patch?
---
GitHub user qqsun8819 opened a pull request:
https://github.com/apache/spark/pull/110
SPARK-1099: Spark's local mode should probably respect spark.cores.max by default
This is for JIRA: https://spark-project.atlassian.net/browse/SPARK-1099
And this is what I did in this patch (also
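(The rest of the description is truncated. As a hedged sketch of the kind of change SPARK-1099 asks for, assuming the `case "local"` branch shown in the diffs above, and not necessarily the patch as merged:)

    case "local" =>
      val scheduler = new TaskSchedulerImpl(sc, MAX_LOCAL_TASK_FAILURES, isLocal = true)
      // Instead of hard-coding one core, honor spark.cores.max when the user
      // sets it (sketch only; the merged change may differ).
      val totalCores = sc.conf.getInt("spark.cores.max", 1)
      val backend = new LocalBackend(scheduler, totalCores)
      scheduler.initialize(backend)
      scheduler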