Hi Yin,
 Thanks for the suggestion.
I’m not happy about this, and I don’t agree with your position that, since it
wasn’t an “officially” supported feature, no harm was done breaking it in the
course of implementing SPARK-6908. I would still argue that SPARK-6908 changed,
and therefore broke, .table()’s API.
(As you know, I’ve filed two bugs regarding this: SPARK-8105 and SPARK-8107.)

I’m done complaining about this issue.
My short-term plan is to change my code for 1.4.0 and
possibly work on a cleaner solution for 1.5.0 that will be acceptable.
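
Roughly, the sort of interim change I have in mind looks like this (a sketch only;
tableFromDb is a hypothetical helper and the database/table names are placeholders):

  import org.apache.spark.sql.DataFrame

  // Qualified names still resolve through sql(), so wrap that in a helper
  // until table() grows a database argument.
  def tableFromDb(db: String, tbl: String): DataFrame =
    sqlContext.sql(s"SELECT * FROM $db.$tbl")

  val df = tableFromDb("mydb", "mytbl")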

Thanks for looking into it and responding to my initial email.

Doug


> On Jun 5, 2015, at 3:36 PM, Yin Huai <yh...@databricks.com> wrote:
> 
> Hi Doug,
> 
> For now, I think you can use "sqlContext.sql("USE databaseName")" to change 
> the current database.
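> 
> For example (a rough sketch; "mydb" and "mytbl" are placeholder names):
> 
>   sqlContext.sql("USE mydb")          // switch the current database
>   val df = sqlContext.table("mytbl")  // now resolves against mydb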
> 
> Thanks,
> 
> Yin
> 
> On Thu, Jun 4, 2015 at 12:04 PM, Yin Huai <yh...@databricks.com> wrote:
> Hi Doug,
> 
> sqlContext.table does not officially support database names. It only accepts a 
> table name as the parameter. We will add a method to support database names in 
> a future version.
> 
> Thanks,
> 
> Yin
> 
> On Thu, Jun 4, 2015 at 8:10 AM, Doug Balog <doug.sparku...@dugos.com> wrote:
> Hi Yin,
>  I’m very surprised to hear that it’s not supported in 1.3, because I’ve been 
> using it since 1.3.0.
> It worked great up until SPARK-6908 was merged into master.
> 
> What is the supported way to get a DF for a table that is not in the default 
> database?
> 
> IMHO, if you are not going to support “databaseName.tableName”, 
> sqlContext.table() should have a version that takes a database and a table, i.e.
> 
> def table(databaseName: String, tableName: String): DataFrame =
>   DataFrame(this, catalog.lookupRelation(Seq(databaseName, tableName)))
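> 
> With a method like that, the call site would just be (hypothetical, since this 
> overload does not exist yet; names are placeholders):
> 
>   val df = sqlContext.table("mydb", "mytbl")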
> 
> The handling of databases in Spark (sqlContext, hiveContext, Catalog) could be 
> better.
> 
> Thanks,
> 
> Doug
> 
> > On Jun 3, 2015, at 8:21 PM, Yin Huai <yh...@databricks.com> wrote:
> >
> > Hi Doug,
> >
> > Actually, sqlContext.table does not support database names in either Spark 1.3 
> > or Spark 1.4. We will support them in a future version.
> >
> > Thanks,
> >
> > Yin
> >
> >
> >
> > On Wed, Jun 3, 2015 at 10:45 AM, Doug Balog <doug.sparku...@dugos.com> 
> > wrote:
> > Hi,
> >
> > sqlContext.table("db.tbl") isn’t working for me; I get a 
> > NoSuchTableException.
> >
> > But I can access the table via
> >
> > sqlContext.sql("select * from db.tbl")
> >
> > So I know it has the table info from the metastore.
> >
> > Anyone else see this?
> >
> > I’ll keep digging.
> > I compiled via make-distribution -Pyarn -Phadoop-2.4 -Phive 
> > -Phive-thriftserver
> > It worked for me in 1.3.1
> >
> > Cheers,
> >
> > Doug
> >
> >
> 
> 
> 


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
