Re: SchemaRDD to Hbase

2014-12-20 Thread Alex Kamil
I'm using JdbcRDD + an HBase JDBC driver + SchemaRDD; make sure to use Spark 1.2.
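The core of the JdbcRDD approach is that Spark splits a numeric key range (lowerBound/upperBound) into one bounded query per partition and runs them in parallel. Since the actual JdbcRDD/Phoenix wiring needs a running cluster, here is a minimal pure-Python sketch of that partitioning idea; the helper names and table/column names are illustrative, not Spark API.

```python
# Sketch of JdbcRDD-style range partitioning (hypothetical helpers, not Spark API).

def partition_ranges(lower, upper, num_partitions):
    """Split the inclusive key range [lower, upper] into per-partition bounds,
    mirroring how JdbcRDD assigns a numeric sub-range to each partition."""
    length = upper - lower + 1
    ranges = []
    for i in range(num_partitions):
        start = lower + (i * length) // num_partitions
        end = lower + ((i + 1) * length) // num_partitions - 1
        ranges.append((start, end))
    return ranges

def bounded_queries(table, key_col, lower, upper, num_partitions):
    """Render one SQL statement per partition, with '?' placeholders that the
    driver would bind to that partition's sub-range."""
    sql = "SELECT * FROM {t} WHERE {k} >= ? AND {k} <= ?".format(t=table, k=key_col)
    return [(sql, start, end)
            for start, end in partition_ranges(lower, upper, num_partitions)]

# Example: 4 partitions over keys 1..100.
for sql, start, end in bounded_queries("EVENTS", "ID", 1, 100, 4):
    print(start, end)
```

In the real JdbcRDD constructor you pass the parameterized SQL, the bounds, the partition count, and a row-mapping function; the point here is only how the bounds carve up the work.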

Re: Creating a front-end for output from Spark/PySpark

2014-11-23 Thread Alex Kamil
Alaa, one option is to use Spark as a cache: import the subset of data from HBase/Phoenix that fits in memory, and use JdbcRDD to fetch more data on a cache miss. The front end can be built with PySpark and Flask, either as a REST API translating JSON requests into the Spark SQL dialect, or simply allowing

Re: Spark SQL with Apache Phoenix lower and upper Bound

2014-11-21 Thread Alex Kamil
Ali, just create a BIGINT column with numeric values in Phoenix and use sequences to populate it automatically. I included the setup below in case someone starts from scratch.

Prerequisites:
- export JAVA_HOME, SCALA_HOME and install sbt
- install HBase
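The sequence-backed BIGINT key can be set up with a few Phoenix statements. Below is a sketch (the EVENTS table and EVENT_ID_SEQ sequence names are examples I chose, not from the thread) showing the DDL and the UPSERT that draws the next sequence value; the statements are held as Python strings so the snippet stands alone, but in practice you would run them through a Phoenix client such as sqlline.py or the JDBC driver.

```python
# Example Phoenix statements (EVENTS / EVENT_ID_SEQ are illustrative names).
SETUP = [
    # Monotonically increasing sequence that feeds the key column.
    "CREATE SEQUENCE EVENT_ID_SEQ",
    # A BIGINT primary key gives JdbcRDD a numeric lowerBound/upperBound to split on.
    "CREATE TABLE EVENTS (ID BIGINT NOT NULL PRIMARY KEY, HOST VARCHAR, PAYLOAD VARCHAR)",
]

def upsert_row(host, payload):
    """Render the UPSERT that lets the sequence populate ID automatically."""
    return ("UPSERT INTO EVENTS (ID, HOST, PAYLOAD) "
            "VALUES (NEXT VALUE FOR EVENT_ID_SEQ, '{}', '{}')".format(host, payload))
```

`NEXT VALUE FOR <sequence>` is Phoenix's syntax for drawing the next sequence value inside an UPSERT, which is what makes the BIGINT column self-populating.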