The names of the directories that are created for the metastore are
different ("metastore" vs "metastore_db"), but that should be it. Really
we should get rid of LocalHiveContext as it is mostly redundant and the
current state is kind of confusing. I've created a JIRA to figure this out
before th
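A minimal sketch of the two contexts being compared, assuming a Spark 1.0-era build with Hive support (the app name is arbitrary):

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.hive.{HiveContext, LocalHiveContext}

    val sc = new SparkContext("local", "metastore-demo")

    // Both contexts speak HiveQL; as noted above, the visible difference is
    // mainly which local metastore directory gets created on first use.
    val hiveCtx = new HiveContext(sc)          // Derby default creates ./metastore_db
    // val localCtx = new LocalHiveContext(sc) // would create ./metastore instead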
Hi again, and thanks for your reply!
On Fri, Jul 4, 2014 at 8:45 PM, Michael Armbrust wrote:
>
>> Sweet. Any idea about when this will be merged into master?
>
>
> It is probably going to be a couple of weeks. There is a fair amount of
> cleanup that needs to be done. It works though and we used it in most of
> the demos at the Spark Summit.
> Sweet. Any idea about when this will be merged into master?
>
It is probably going to be a couple of weeks. There is a fair amount of
cleanup that needs to be done. It works though and we used it in most of
the demos at the Spark Summit. Mostly I just need to add tests and move it
out of Hive
On Fri, Jul 4, 2014 at 11:39 AM, Michael Armbrust wrote:
> On Fri, Jul 4, 2014 at 1:59 AM, Martin Gammelsæter wrote:
>>
>> is there any way to write user defined functions for Spark SQL?
> This is coming in Spark 1.1. There is a work in progress PR here:
> https://github.com/apache/spark/pull/1063
On Fri, Jul 4, 2014 at 1:59 AM, Martin Gammelsæter <martingammelsae...@gmail.com> wrote:
> is there any way to write user defined functions for Spark SQL?
This is coming in Spark 1.1. There is a work in progress PR here:
https://github.com/apache/spark/pull/1063
If you have a hive context, yo
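The message is cut off here, but a HiveContext does expose Hive's own UDF mechanism. A rough sketch of what registering and calling a Hive UDF through hql might look like; the UDF class and the "events" table are made up, and whether this works depends on the Hive support in your Spark build:

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.hive.HiveContext

    val sc = new SparkContext("local", "hive-udf-demo")
    val hiveCtx = new HiveContext(sc)

    // Register an existing Hive UDF class (the class name is hypothetical)
    // and call it from HiveQL; the "events" table is hypothetical too.
    hiveCtx.hql("CREATE TEMPORARY FUNCTION parse_title AS 'com.example.hive.ParseTitleUDF'")
    hiveCtx.hql("SELECT parse_title(payload) FROM events").collect().foreach(println)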
Ah, sorry for misreading.
I don't think there is a way to use UDFs in your SQL queries with Spark SQL alone.
You might be able to do it with SparkHive, but I'm sorry, I don't know it well.
I think you should apply the function before converting to a SchemaRDD, if you can.
Thanks.
2014-07-04 18:16 GMT+09:00 Martin
Takuya, thanks for your reply :)
I am already doing that, and it is working well. My question is, can I
define arbitrary functions to be used in these queries?
On Fri, Jul 4, 2014 at 11:12 AM, Takuya UESHIN wrote:
> Hi,
>
> You can convert a standard RDD of a Product class (e.g. a case class) to a SchemaRDD with SQLContext.
Hi,
You can convert a standard RDD of a Product class (e.g. a case class) to a
SchemaRDD with SQLContext.
Load the data from Cassandra into an RDD of a case class, convert it to a
SchemaRDD and register it as a table, and then you can use it in your SQL
queries.
http://spark.apache.org/docs/latest/sql-programming-guide.html#running-sql-on
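A minimal sketch of that workflow, along the lines of the programming guide linked above (the case class, data, and table name are made up):

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext

    // Stand-in for the rows loaded from Cassandra.
    case class Person(name: String, age: Int)

    val sc = new SparkContext("local", "schemardd-demo")
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit RDD[case class] -> SchemaRDD

    val people = sc.parallelize(Seq(Person("Alice", 29), Person("Bob", 17)))
    people.registerAsTable("people")   // register so SQL queries can see it

    sqlContext.sql("SELECT name FROM people WHERE age >= 18")
      .map(row => "Name: " + row(0))
      .collect()
      .foreach(println)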
Hi!
I have a Spark cluster running on top of a Cassandra cluster, using
Datastax's new driver, and one of the fields of my RDDs is an
XML string. In a normal Scala Spark job, parsing that data is no
problem, but I would like to also make that information available
through Spark SQL. So, is there any way to write user defined functions
for Spark SQL?
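Tying the thread together, a rough sketch of the approach Takuya describes above, applied to this XML case: parse the XML with ordinary Scala code first, then register the parsed rows so Spark SQL can query them. All class, field, and table names here are made up:

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext
    import scala.xml.XML

    // Hypothetical shape of the rows: an id plus the XML payload described above.
    case class RawRow(id: String, xmlPayload: String)
    case class ParsedRow(id: String, title: String)

    val sc = new SparkContext("local", "xml-sql-demo")
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD

    // In the real job these rows would come from Cassandra via the DataStax
    // driver; a couple of inline rows keep the sketch self-contained.
    val rawRows = sc.parallelize(Seq(
      RawRow("1", "<doc><title>first</title></doc>"),
      RawRow("2", "<doc><title>second</title></doc>")
    ))

    // Parse the XML with plain Scala code *before* converting to a SchemaRDD,
    // then register the parsed result so it can be queried with Spark SQL.
    val parsed = rawRows.map { r =>
      val doc = XML.loadString(r.xmlPayload)
      ParsedRow(r.id, (doc \ "title").text)
    }
    parsed.registerAsTable("docs")

    sqlContext.sql("SELECT id, title FROM docs").collect().foreach(println)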