Re: transition SQLContext to SparkSession

2016-07-18 Thread Michael Armbrust
+ dev, reynold

Yeah, that's a good point.  I wonder if SparkSession.sqlContext should be
public/deprecated?

On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers  wrote:

> in my codebase I would like to gradually transition to SparkSession, so
> while I start using SparkSession I also want a SQLContext to be available
> as before (but with a deprecation warning when I use it). this should be
> easy since SQLContext is now a wrapper for SparkSession.
>
> so basically:
> val session = SparkSession.builder.set(..., ...).getOrCreate()
> val sqlc = new SQLContext(session)
>
> however this doesn't work: the SQLContext constructor I am trying to use is
> private. SparkSession.sqlContext is also private.
>
> am I missing something?
>
> a non-gradual switch is not very realistic in any significant codebase,
> and I do not want to create a SparkSession and a SQLContext independently
> (both from the same SparkContext) since that can only lead to confusion and
> inconsistent settings.
>
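For reference, a minimal sketch of the gradual-transition pattern being asked for here, assuming a Spark build in which `SparkSession.sqlContext` is public (as it later became); the object name and builder settings are illustrative, not from the thread:

```scala
import org.apache.spark.sql.SparkSession

// One SparkSession drives everything; legacy code borrows its
// SQLContext instead of constructing a second, independent one.
object GradualMigration {
  def main(args: Array[String]): Unit = {
    val session = SparkSession.builder
      .appName("migration-example") // illustrative settings
      .master("local[*]")
      .getOrCreate()

    // Hand this to legacy call sites; it wraps the same session,
    // so configuration cannot diverge between the two handles.
    val sqlc = session.sqlContext

    // Both handles refer to the same underlying session.
    assert(sqlc.sparkSession eq session)

    session.stop()
  }
}
```

New code calls `session` directly; old code keeps compiling against `sqlc` until it is migrated.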


Re: Build changes after SPARK-13579

2016-07-18 Thread Michael Gummelt
I just flailed on this a bit before finding this email.  Can someone please
update
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup

On Mon, Apr 4, 2016 at 10:01 PM, Reynold Xin  wrote:

> pyspark and R
>
> On Mon, Apr 4, 2016 at 9:59 PM, Marcelo Vanzin 
> wrote:
>
>> No, tests (except pyspark) should work without having to package anything
>> first.
>>
>> On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers  wrote:
>> > do i need to run sbt package before doing tests?
>> >
>> > On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin 
>> wrote:
>> >>
>> >> Hey all,
>> >>
>> >> We merged  SPARK-13579 today, and if you're like me and have your
>> >> hands automatically type "sbt assembly" anytime you're building Spark,
>> >> that won't work anymore.
>> >>
>> >> You should now use "sbt package"; you'll still need "sbt assembly" if
>> >> you require one of the remaining assemblies (streaming connectors,
>> >> yarn shuffle service).
>> >>
>> >>
>> >> --
>> >> Marcelo
>> >>
>> >> -
>> >> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> >> For additional commands, e-mail: dev-h...@spark.apache.org
>> >>
>> >
>>
>>
>>
>> --
>> Marcelo
>>
>> -
>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> For additional commands, e-mail: dev-h...@spark.apache.org
>>
>>
>
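For anyone landing here from a search, the build commands after SPARK-13579 look roughly like this; the exact sbt module name for the YARN shuffle service is an assumption, not stated in the thread:

```shell
# Regular build: per-module jars, no monolithic assembly step.
build/sbt package

# Tests run against the packaged classes; only pyspark and R
# still need the packaging step first.
build/sbt test

# "sbt assembly" is still needed only for the remaining assemblies
# (streaming connectors, YARN shuffle service), e.g.:
build/sbt network-yarn/assembly
```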


-- 
Michael Gummelt
Software Engineer
Mesosphere


Re: transition SQLContext to SparkSession

2016-07-18 Thread Reynold Xin
Good idea.

https://github.com/apache/spark/pull/14252



On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust 
wrote:

> + dev, reynold
>
> Yeah, that's a good point.  I wonder if SparkSession.sqlContext should be
> public/deprecated?
>
> On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers  wrote:
>
>> in my codebase I would like to gradually transition to SparkSession, so
>> while I start using SparkSession I also want a SQLContext to be available
>> as before (but with a deprecation warning when I use it). this should be
>> easy since SQLContext is now a wrapper for SparkSession.
>>
>> so basically:
>> val session = SparkSession.builder.set(..., ...).getOrCreate()
>> val sqlc = new SQLContext(session)
>>
>> however this doesn't work: the SQLContext constructor I am trying to use
>> is private. SparkSession.sqlContext is also private.
>>
>> am I missing something?
>>
>> a non-gradual switch is not very realistic in any significant codebase,
>> and I do not want to create a SparkSession and a SQLContext independently
>> (both from the same SparkContext) since that can only lead to confusion and
>> inconsistent settings.
>>
>
>


ApacheCon: Getting the word out internally

2016-07-18 Thread Melissa Warnkin
Dear Apache Enthusiast,

As you are no doubt already aware, we will be holding ApacheCon in
Seville, Spain, the week of November 14th, 2016. The call for papers
(CFP) for this event is now open, and will remain open until
September 9th.

The event is divided into two parts, each with its own CFP. The first
part of the event, called Apache Big Data, focuses on Big Data
projects and related technologies.

Website: http://events.linuxfoundation.org/events/apache-big-data-europe
CFP:
http://events.linuxfoundation.org/events/apache-big-data-europe/program/cfp

The second part, called ApacheCon Europe, focuses on the Apache
Software Foundation as a whole, covering all projects, community
issues, governance, and so on.

Website: http://events.linuxfoundation.org/events/apachecon-europe
CFP: http://events.linuxfoundation.org/events/apachecon-europe/program/cfp

ApacheCon is the official conference of the Apache Software
Foundation, and is the best place to meet members of your project and
other ASF projects, and strengthen your project's community.

If your organization is interested in sponsoring ApacheCon, contact Rich Bowen
at e...@apache.org. ApacheCon is a great place to find the brightest
developers in the world, and experts on a huge range of technologies.

I hope to see you in Seville!
==

Melissa, on behalf of the ApacheCon Team