Github user qqsun8819 commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37497727
Thanks @aarondav, it doesn't matter. Any advice on this patch is welcome.
---
Github user qqsun8819 commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37407753
Modified the patch according to @aarondav's review.
---
Github user qqsun8819 commented on a diff in the pull request:
https://github.com/apache/spark/pull/110#discussion_r10463110
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1204,7 +1204,7 @@ object SparkContext extends Logging {
master match
GitHub user qqsun8819 opened a pull request:
https://github.com/apache/spark/pull/110
SPARK-1099: Spark's local mode should probably respect spark.cores.max by
default
This is for JIRA: https://spark-project.atlassian.net/browse/SPARK-1099
And this is what I do in this
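The idea behind SPARK-1099 — that local mode should cap its parallelism at `spark.cores.max` rather than ignoring it — can be sketched as below. This is an illustrative stand-in, not the actual patch: `localThreads` and the map-based config are hypothetical helpers, and the real change lives in the `master match` of `SparkContext` shown in the diff above.

```java
import java.util.Map;

// Hedged sketch of the SPARK-1099 idea: when the master is "local",
// consult spark.cores.max (if set) to bound the thread count, instead
// of ignoring the setting. `conf` stands in for a SparkConf here.
public class LocalModeCores {
    static int localThreads(Map<String, String> conf) {
        int available = Runtime.getRuntime().availableProcessors();
        String max = conf.get("spark.cores.max");
        if (max == null) {
            return available; // no cap configured: use all local cores
        }
        // Respect the configured cap, but never exceed the machine.
        return Math.min(Integer.parseInt(max), available);
    }

    public static void main(String[] args) {
        System.out.println(localThreads(Map.of("spark.cores.max", "1")));
        System.out.println(localThreads(Map.of()));
    }
}
```

With a cap of "1" this always prints 1; without a cap it falls back to the machine's core count, which is the behavior the PR argues local mode should only use when `spark.cores.max` is unset.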