Pierre - I'm not sure that would work. I just opened a Spark shell and did
this:

scala> classOf[SparkContext].getClass.getPackage.getImplementationVersion
res4: String = 1.7.0_25

It looks like this returns the JVM version rather than the Spark version.
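The culprit seems to be the extra `.getClass`: calling `getClass` on a `Class` object yields `java.lang.Class` itself, so the package being inspected is `java.lang`, whose Implementation-Version is the JVM's. A plain-Java illustration (using `String` as a stand-in for SparkContext; any class behaves the same way):

```java
public class ClassOfClassDemo {
    public static void main(String[] args) {
        // String.class stands in for classOf[SparkContext] here.
        // Calling getClass() on a Class object yields java.lang.Class itself...
        Class<?> inner = String.class.getClass();
        System.out.println(inner.getName());              // java.lang.Class
        System.out.println(inner.getPackage().getName()); // java.lang
        // ...so the Implementation-Version read below belongs to the
        // java.lang package, i.e. the JVM, not to the original class's jar.
        // (Exact value varies by JDK, e.g. "1.7.0_25"; may be null on some.)
        System.out.println(inner.getPackage().getImplementationVersion());
    }
}
```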

- Patrick
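That said, the manifest-based idea in Pierre's message below should still hold when `getClass` is called on an instance inside SparkContext itself, since the receiver is then the SparkContext class and the version comes from the enclosing jar's manifest (which sbt can populate from the build definition). A minimal sketch in plain Java; the class name is hypothetical, and it falls back to "unknown" when no manifest information is available, e.g. when run from loose .class files:

```java
public class VersionedService {
    public String version() {
        // On an instance, getClass() is this class (not java.lang.Class),
        // so the Implementation-Version comes from the jar that contains it.
        Package p = getClass().getPackage();
        // Package info can be absent outside a manifested jar,
        // e.g. when running from a classes/ directory during development.
        String v = (p == null) ? null : p.getImplementationVersion();
        return (v != null) ? v : "unknown";
    }

    public static void main(String[] args) {
        // Run from loose .class files this prints "unknown";
        // from a jar whose manifest sets Implementation-Version,
        // it prints the build's version string.
        System.out.println(new VersionedService().version());
    }
}
```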


On Thu, Apr 10, 2014 at 2:08 PM, Pierre Borckmans <
pierre.borckm...@realimpactanalytics.com> wrote:

> I see that this was fixed using a fixed string in SparkContext.scala.
> Wouldn't it be better to use something like:
>
> getClass.getPackage.getImplementationVersion
>
> to get the version from the jar manifest (and thus from the sbt
> definition)?
>
> The same holds for SparkILoopInit.scala in the welcome message
> (printWelcome).
>
> This would avoid having to modify these strings at each release.
>
> cheers
>
>
>
> Pierre Borckmans
>
> RealImpact Analytics | Brussels Office
> www.realimpactanalytics.com | pierre.borckm...@realimpactanalytics.com
>
> FR +32 485 91 87 31 | Skype pierre.borckmans
>
>
> On 10 Apr 2014, at 23:05, Patrick Wendell <pwend...@gmail.com> wrote:
>
> I think this was solved in a recent merge:
>
>
> https://github.com/apache/spark/pull/204/files#diff-364713d7776956cb8b0a771e9b62f82dR779
>
> Is that what you are looking for? If so, mind marking the JIRA as resolved?
>
>
> On Wed, Apr 9, 2014 at 3:30 PM, Nicholas Chammas <
> nicholas.cham...@gmail.com> wrote:
>
>> Hey Patrick,
>>
>> I've created SPARK-1458 (https://issues.apache.org/jira/browse/SPARK-1458)
>> to track this request, in case the team/community wants to implement it in
>> the future.
>>
>> Nick
>>
>>
>> On Sat, Feb 22, 2014 at 7:25 PM, Nicholas Chammas <
>> nicholas.cham...@gmail.com> wrote:
>>
>>> No use case at the moment.
>>>
>>> What prompted the question: I was going to ask a different question on
>>> this list and wanted to note my version of Spark. I assumed there would be
>>> a getVersion method on SparkContext or something like that, but I couldn't
>>> find one in the docs. I also couldn't find an environment variable with the
>>> version. After futzing around a bit I realized it was printed out (quite
>>> conspicuously) in the shell startup banner.
>>>
>>>
>>> On Sat, Feb 22, 2014 at 7:15 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>>>
>>>> AFAIK - We don't have any way to do this right now. Maybe we could add
>>>> a getVersion method to SparkContext that would tell you. Just
>>>> wondering - what is the use case here?
>>>>
>>>> - Patrick
>>>>
>>>> On Sat, Feb 22, 2014 at 4:04 PM, nicholas.chammas
>>>> <nicholas.cham...@gmail.com> wrote:
>>>> > Is there a programmatic way to tell what version of Spark I'm running?
>>>> >
>>>> > I know I can look at the banner when the Spark shell starts up, but I'm
>>>> > curious to know if there's another way.
>>>> >
>>>> > Nick
>>>> >
>>>> >
>>>> > ________________________________
>>>> > View this message in context: programmatic way to tell Spark version
>>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>
>>>
>>
>
>
