Hi Jerry,
https://issues.apache.org/jira/browse/SPARK-11562 is created for the issue.
Thanks.
Zhan Zhang
On Nov 6, 2015, at 3:01 PM, Jerry Lam <chiling...@gmail.com> wrote:
> Hi Zhan,
> Thank you for providing a workaround!
> I will try this out but I agree with Ted, there should be a better way to ...
I agree, with a minor change: add a config to provide the option to init
SQLContext or HiveContext, with HiveContext as the default, instead of bypassing
it when hitting the exception.
Thanks.
Zhan Zhang
On Nov 6, 2015, at 2:53 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> I would suggest adding a ...
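Zhan's proposed option could look roughly like this in the shell init path; this is only a sketch, and the config name "spark.sql.shell.context" is made up here for illustration:
// sketch of the proposed option; "spark.sql.shell.context" is a hypothetical name
val useHive = sc.getConf.get("spark.sql.shell.context", "hive") == "hive"
val sqlContext =
  if (useHive) new org.apache.spark.sql.hive.HiveContext(sc)
  else new org.apache.spark.sql.SQLContext(sc)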
Hi Zhan,
Thank you for providing a workaround!
I will try this out but I agree with Ted, there should be a better way to
capture the exception and handle it by just initializing SQLContext instead of
HiveContext, and WARN the user that something is wrong with his Hive setup.
Having spark.sql.hive ...
I would suggest adding a config parameter that allows bypassing
initialization of HiveContext in case of SQLException
Cheers
On Fri, Nov 6, 2015 at 2:50 PM, Zhan Zhang wrote:
> Hi Jerry,
>
> OK. Here is an ugly workaround.
>
> Put a hive-site.xml under $SPARK_HOME/conf with invalid content. You ...
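Ted's suggestion would amount to something like the following at shell startup; a sketch only, with the error handling simplified:
// sketch: fall back to a plain SQLContext when Hive initialization fails
val sqlContext =
  try {
    new org.apache.spark.sql.hive.HiveContext(sc)
  } catch {
    case e: java.sql.SQLException =>
      // per Jerry's suggestion, a WARN about the broken Hive setup would go here
      new org.apache.spark.sql.SQLContext(sc)
  }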
Hi Jerry,
OK. Here is an ugly workaround.
Put a hive-site.xml under $SPARK_HOME/conf with invalid content. You will get a
bunch of exceptions because of the HiveContext initialization failure, but you can
initialize your SQLContext on your own:
scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
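To sanity-check that the hand-built SQLContext works without a metastore, any plain DataFrame op will do, e.g. (assuming a 1.4+ shell, where range is available on SQLContext):
scala> // a DataFrame operation that never touches Hive or Derby
scala> sqlContext.range(0, 10).count()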
Hi Zhan,
I don’t use HiveContext features at all. I use mostly the DataFrame API. It is
sexier and involves much less typing. :)
Also, HiveContext requires a metastore database setup (Derby by default). The
problem is that I cannot have 2 spark-shell sessions running at the same time
on the same host (e.g. /hom ...
If your assembly jar has the hive jar included, the HiveContext will be used.
Typically, HiveContext has more functionality than SQLContext. In what case do you
have to use SQLContext for something that cannot be done by HiveContext?
Thanks.
Zhan Zhang
On Nov 6, 2015, at 10:43 AM, Jerry Lam <chiling...@gmail.com> wrote: ...
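One concrete example of that gap in the 1.x line, for what it's worth: window functions only work with a HiveContext, and a plain SQLContext fails to analyze the OVER clause. A sketch, where hiveContext is assumed to be a HiveContext and the employees table is purely hypothetical:
scala> // analyzes fine on HiveContext in 1.x; SQLContext rejects the window function
scala> hiveContext.sql("SELECT name, row_number() OVER (PARTITION BY dept ORDER BY salary) AS rk FROM employees")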
What is interesting is that the pyspark shell works fine with multiple sessions on
the same host even though multiple HiveContexts have been created. What does
pyspark do differently in terms of starting up the shell?
> On Nov 6, 2015, at 12:12 PM, Ted Yu wrote:
>
> In SQLContext.scala:
> // A ...
Hi Ted,
I was trying to set spark.sql.dialect to sql to specify that I only need
“SQLContext”, not HiveContext. It didn’t work. It still instantiates HiveContext.
Since I don’t use HiveContext and I don’t want to start a MySQL database
because I want to have more than 1 session of spark-shell simultaneously ...
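Presumably the attempt looked like the following (a sketch only); note that in 1.x, spark.sql.dialect switches which SQL parser sql() uses, not which context class the shell instantiates:
scala> // only changes the parser behind sqlContext.sql(...), not the context type
scala> sqlContext.setConf("spark.sql.dialect", "sql")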
In SQLContext.scala:
  // After we have populated SQLConf, we call setConf to populate other
  // confs in the subclass (e.g. hiveconf in HiveContext).
  properties.foreach {
    case (key, value) => setConf(key, value)
  }
I don't see a config for skipping the above call.
FYI
On Fri, No ...