One issue, 3-4 PRs — the Spark dev community is really active :)
It seems spark-shell currently accepts only some SUBMISSION_OPTS, but no
APPLICATION_OPTS.
Is there a plan to add some APPLICATION_OPTS or CLI_OPTS like
hive -e
hive -f
hive -hivevar
so that we can use our Scala code as scripts and run them directly?
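The split being asked for can be sketched as a small bash helper that partitions a command line into submission options and application options, similar in spirit to what spark-shell's wrapper script does. The recognized option names and the hive-style `-e` flag below are illustrative assumptions, not Spark's actual interface:

```shell
#!/usr/bin/env bash
# Sketch: partition a command line into SUBMISSION_OPTS (consumed by
# spark-submit) and APPLICATION_OPTS (handed to the user's script).
# The option names recognized here are illustrative, not Spark's real list.

SUBMISSION_OPTS=()
APPLICATION_OPTS=()

# Simulate an invocation like: spark-shell --master local[2] -e 'println(1 + 1)'
set -- --master "local[2]" -e 'println(1 + 1)'

while (( $# )); do
  case "$1" in
    --master|--deploy-mode|--driver-memory)  # submission options taking a value
      SUBMISSION_OPTS+=("$1" "$2"); shift 2 ;;
    --verbose)                               # submission flag without a value
      SUBMISSION_OPTS+=("$1"); shift ;;
    *)                                       # everything else goes to the app
      APPLICATION_OPTS+=("$1"); shift ;;
  esac
done

echo "submission:  ${SUBMISSION_OPTS[*]}"
echo "application: ${APPLICATION_OPTS[*]}"
```

With the simulated arguments above, the loop routes `--master local[2]` into the submission list and `-e 'println(1 + 1)'` into the application list, which is the behavior the `hive -e`-style interface would need.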
Just opened a PR, based on the branch Patrick mentioned, for this issue:
https://github.com/apache/spark/pull/1864
On Sat, Aug 9, 2014 at 6:48 AM, Patrick Wendell wrote:
Cheng Lian also has a fix for this. I've asked him to make a PR - he
is on China time so it probably won't come until tonight:
https://github.com/liancheng/spark/compare/apache:master...liancheng:spark-2894
On Fri, Aug 8, 2014 at 3:46 PM, Sandy Ryza wrote:
Hi Chutium,
This is currently being addressed in
https://github.com/apache/spark/pull/1825
-Sandy
On Fri, Aug 8, 2014 at 2:26 PM, chutium wrote:
No one uses spark-shell on the master branch?
I created a PR as a follow-up commit to SPARK-2678 and PR #1801:
https://github.com/apache/spark/pull/1861
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/spark-shell-is-broken-bad-option-master-tp7778p7780.html
Maybe this commit is the reason?
https://github.com/apache/spark/commit/a6cd31108f0d73ce6823daafe8447677e03cfd13
I found some discussion in its PR: https://github.com/apache/spark/pull/1801
The important part is what vanzin said:
https://github.com/apache/spark/pull/1801#issuecomment-51545117
I tried to use