Looks like it. I'm guessing this didn't make the cut for 0.9.1, and will
instead be included with 1.0.0.

So would you access it just by calling sc.version from the shell? And will
this automatically make it into the Python API?

I'll mark the JIRA issue as resolved.


On Thu, Apr 10, 2014 at 5:05 PM, Patrick Wendell <pwend...@gmail.com> wrote:

> I think this was solved in a recent merge:
>
>
> https://github.com/apache/spark/pull/204/files#diff-364713d7776956cb8b0a771e9b62f82dR779
>
> Is that what you are looking for? If so, mind marking the JIRA as resolved?
>
>
> On Wed, Apr 9, 2014 at 3:30 PM, Nicholas Chammas <
> nicholas.cham...@gmail.com> wrote:
>
>> Hey Patrick,
>>
>> I've created SPARK-1458 <https://issues.apache.org/jira/browse/SPARK-1458> to
>> track this request, in case the team/community wants to implement it in the
>> future.
>>
>> Nick
>>
>>
>> On Sat, Feb 22, 2014 at 7:25 PM, Nicholas Chammas <
>> nicholas.cham...@gmail.com> wrote:
>>
>>> No use case at the moment.
>>>
>>> What prompted the question: I was going to ask a different question on
>>> this list and wanted to note my version of Spark. I assumed there would be
>>> a getVersion method on SparkContext or something like that, but I couldn't
>>> find one in the docs. I also couldn't find an environment variable with the
>>> version. After futzing around a bit I realized it was printed out (quite
>>> conspicuously) in the shell startup banner.
>>>
>>>
>>> On Sat, Feb 22, 2014 at 7:15 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>>>
>>>> AFAIK - we don't have any way to do this right now. Maybe we could add
>>>> a getVersion method to SparkContext that would tell you. Just
>>>> wondering - what is the use case here?
>>>>
>>>> - Patrick
>>>>
>>>> On Sat, Feb 22, 2014 at 4:04 PM, nicholas.chammas
>>>> <nicholas.cham...@gmail.com> wrote:
>>>> > Is there a programmatic way to tell what version of Spark I'm running?
>>>> >
>>>> > I know I can look at the banner when the Spark shell starts up, but I'm
>>>> > curious to know if there's another way.
>>>> >
>>>> > Nick
>>>> >
>>>>
>>>
>>>
>>
>
