Hi Todd,

We haven't had a chance to update it yet. We will update it after the 1.5 release.

Thanks,

Yin

On Thu, Aug 13, 2015 at 6:49 AM, Todd <bit1...@163.com> wrote:

> Hi,
> I have a question about the spark-sql-perf project by Databricks at
> https://github.com/databricks/spark-sql-perf/
>
>
> The Tables.scala (
> https://github.com/databricks/spark-sql-perf/blob/master/src/main/scala/com/databricks/spark/sql/perf/bigdata/Tables.scala)
> and BigData.scala (
> https://github.com/databricks/spark-sql-perf/blob/master/src/main/scala/com/databricks/spark/sql/perf/bigdata/BigData.scala)
> are empty files.
> Is this intentional, or is it a bug?
> Also, the code snippet below, from the README.md, won't compile because there
> is no Tables class defined in the org.apache.spark.sql.parquet package:
> (I am using Spark 1.4.1; is the code compatible with Spark 1.4.1?)
>
> import org.apache.spark.sql.parquet.Tables
> // Tables in TPC-DS benchmark used by experiments.
> val tables = Tables(sqlContext)
> // Setup TPC-DS experiment
> val tpcds = new TPCDS (sqlContext = sqlContext)
>
>
>
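For reference, here is a minimal sketch of the same setup with the import pointed at the project's own package instead of org.apache.spark.sql.parquet. The com.databricks.spark.sql.perf.tpcds package path is an assumption and has not been verified against the source that shipped for Spark 1.4.1, so treat it as a guess rather than the project's confirmed API.

    // Assumed package path; the actual location may differ in this version of spark-sql-perf.
    import com.databricks.spark.sql.perf.tpcds.Tables

    // Tables in the TPC-DS benchmark used by experiments.
    val tables = Tables(sqlContext)
    // Set up the TPC-DS experiment.
    val tpcds = new TPCDS(sqlContext = sqlContext)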
