Hi,
> It's a fair point that this would be better. I'll put it on my radar.
Thanks for the consideration. If you need anything reviewed just ask.
Justin
One thing we can do is monthly milestone releases, similar to other
projects (e.g. Scala). Then we could have Apache Spark 2.1.0-M1, Apache
Spark 2.1.0-M2, and so on.
On Thu, Jun 2, 2016 at 12:42 PM, Tom Graves wrote:
> The documentation for the preview release also seems to be missing?
>
> Also what happens if we want to do a second preview release?
The documentation for the preview release also seems to be missing?
Also, what happens if we want to do a second preview release? The naming
doesn't seem to allow that unless we call it "preview 2".
Tom
Agree, not really private. I thought it might be the smaller audience that
is actually interested, but there's no reason not to share a bit more widely.
On Thu, Jun 2, 2016, 11:53 Mattmann, Chris A (3980) <
chris.a.mattm...@jpl.nasa.gov> wrote:
> +1 this is a good proactive move.
>
> Also a lot of this co
On Thu, Jun 2, 2016 at 10:22 AM, Justin Mclean wrote:
>> Which are unneeded?
>
> For starters, permissive licenses are mentioned; NOTICE is for required
> notices only. I'd suggest you ask on the general@incubator mailing list for
> a review. People there have knowledge and experience of what should be
> included.
Hi,
> My understanding of the trademark policy from discussions over the past
> month is that software identifiers like Maven coordinates do not
> strictly require 'apache’.
Yes, it's not required, but given the branding issues it may be useful to do.
>> - The year in the NOTICE file is out of date
In this case we're just talking about the name of the .tgz archives
that are distributed for download. I agree we would not want to change
the Maven coordinates.
On Thu, Jun 2, 2016 at 9:28 AM, Marcin Tustin wrote:
> Changing the maven co-ordinates is going to cause everyone in the world who
> uses a maven-based build system to have to update their builds.
Changing the maven co-ordinates is going to cause everyone in the world who
uses a maven-based build system to have to update their builds. Given that
sbt uses ivy by default, that's likely to affect almost every Spark user.
Unless we can articulate what the extra legal protections are (and frankly
I
You should set both PYSPARK_DRIVER_PYTHON and PYSPARK_PYTHON to the path
of your Python interpreter.
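For example, something like this in conf/spark-env.sh (the interpreter path
is illustrative; point both variables at whichever Python actually has your
dependencies installed):

  export PYSPARK_PYTHON=/usr/bin/python
  export PYSPARK_DRIVER_PYTHON=/usr/bin/python

PYSPARK_PYTHON is used for the executors and PYSPARK_DRIVER_PYTHON for the
driver (it defaults to PYSPARK_PYTHON), so a mismatch between the two can
make an import work in the shell but fail inside a job.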
2016-06-02 20:32 GMT+07:00 Bhupendra Mishra :
> That did not resolve it. :(
>
> On Thu, Jun 2, 2016 at 3:01 PM, Sergio Fernández
> wrote:
>
>>
>> On Thu, Jun 2, 2016 at 9:59 AM, Bhupendra Mishra <
>> bhupendra.mis...@gmail.com> wrote:
+dev
On Wed, Jun 1, 2016 at 11:42 PM, Justin Mclean wrote:
> Anyway looking at the preview I noticed a few minor things:
> - Most release artefacts have the word “apache” in them; the ones at [1] do
> not. Adding “apache” gives you some extra legal protection.
As to why just 'spark' -- I believe
I think they're deprecated, not necessarily entirely unused. I
personally might leave it, but don't feel strongly about it.
On Thu, Jun 2, 2016 at 4:35 AM, Jacek Laskowski wrote:
> Hi,
>
> While reviewing where SPARK_YARN_MODE is used and how, I found one
> "weird" place where "yarn-client" is checked against.
That did not resolve it. :(
On Thu, Jun 2, 2016 at 3:01 PM, Sergio Fernández wrote:
>
> On Thu, Jun 2, 2016 at 9:59 AM, Bhupendra Mishra <
> bhupendra.mis...@gmail.com> wrote:
>>
>> and I have already exported the environment variable in spark-env.sh as
>> follows, but the error is still there: ImportError: No module named numpy
Hi,
While reviewing where SPARK_YARN_MODE is used and how, I found one
"weird" place where the "yarn-client" is checked against - see
https://github.com/apache/spark/blob/master/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L946.
Since yarn-client (and yarn-cluster) are no
On Thu, Jun 2, 2016 at 9:59 AM, Bhupendra Mishra wrote:
>
> and I have already exported the environment variable in spark-env.sh as
> follows, but the error is still there: ImportError: No module named numpy
>
> export PYSPARK_PYTHON=/usr/bin/python
>
According to the documentation at
http://spark.apache.o
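A quick sanity check is to run the exact interpreter you exported and try
the import directly (the path matches the spark-env.sh line quoted above;
adjust it if yours differs):

  /usr/bin/python -c 'import numpy'

If that command fails, numpy isn't installed for that interpreter, and
PySpark will hit the same ImportError on the workers.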
It's RHEL.
and I have already exported the environment variable in spark-env.sh as
follows, but the error is still there: ImportError: No module named numpy
export PYSPARK_PYTHON=/usr/bin/python
Thanks
On Thu, Jun 2, 2016 at 12:04 AM, Julio Antonio Soto de Vicente <
ju...@esbet.es> wrote:
> Try adding