Thanks Gopal. I am on Spark 1.6.1 and getting the following error:
scala> var conn = LlapContext.newInstance(sc, hs2_url)
<console>:28: error: not found: value LlapContext
       var conn = LlapContext.newInstance(sc, hs2_url)

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com


On 1 August 2016 at 22:53, Gopal Vijayaraghavan <gop...@apache.org> wrote:

> > Spark fails reading this table. What options do I have here?
>
> Would your issue be the same as
> https://issues.apache.org/jira/browse/SPARK-13129?
>
> LlapContext in Spark can read those tables with ACID semantics (i.e.
> deletes/updates will work correctly).
>
>     var conn = LlapContext.newInstance(sc, hs2_url);
>     var df: DataFrame = conn.sql("select * from payees").persist();
>
> Please be aware that this runs entirely in auto-commit mode, so you will
> be getting lazy snapshot isolation (hence, persist is a good idea).
>
> Even though "payees" is a placeholder, this approach is intended for
> tables like it that have multiple consumers. The practical reason to use
> this pathway is to apply masking/filtering specific to the accessing
> user (e.g. hiding amounts, or bucketing them into ranges such as 0-99,
> 99-999, etc. instead of the actual values, for compliance audits without
> creating complete copies).
>
> Cheers,
> Gopal
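
For what it's worth, a "not found: value LlapContext" error in spark-shell usually means the class is simply not on the classpath, since LlapContext ships in the separate spark-llap connector rather than in stock Spark 1.6.x. The sketch below is only a guess at what a working session might look like; the jar path, package name, and HS2 URL are all assumptions, so check them against the spark-llap build that matches your Hive/Spark versions:

```scala
// Sketch only; names below are assumptions, not verified against spark-llap.
// 1) Launch spark-shell with the connector jar on the classpath, e.g.:
//      spark-shell --jars /path/to/spark-llap-assembly.jar
// 2) Then import the class so the symbol resolves (exact package is
//    connector-specific; this one is a placeholder):
// import org.apache.hadoop.hive.llap.LlapContext

import org.apache.spark.sql.DataFrame

// Hypothetical HiveServer2 interactive (LLAP) endpoint:
val hs2_url = "jdbc:hive2://hs2-host:10500/"

// Once the import resolves, Gopal's snippet should apply as-is:
// val conn = LlapContext.newInstance(sc, hs2_url)
// val df: DataFrame = conn.sql("select * from payees").persist()
```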