I tried the same code in spark-shell, and the result is the same.
So the problem seems to come from Spark itself; I suspect Spark's JDBC data source doesn't support Apache Tajo.
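One detail worth noting in the output below: Tajo reports fully qualified column names (e.g. analysis.component_usage_2015.gnl_nm_cd), which is unusual for a JDBC source and may confuse the queries Spark generates. Renaming the columns to their simple names before registering the temp table is a cheap thing to try. A minimal, self-contained sketch of that renaming step (plain Scala, no Spark dependency; the object and method names here are hypothetical, and the column names are copied from the output below):

```scala
// Hypothetical helper: reduce a fully qualified Tajo column name such as
// "analysis.component_usage_2015.gnl_nm_cd" to its simple name "gnl_nm_cd".
object ColumnNames {
  def simple(qualified: String): String =
    qualified.split('.').last  // keep only the last dot-separated segment

  def main(args: Array[String]): Unit = {
    val cols = Seq(
      "analysis.component_usage_2015.gnl_nm_cd",
      "analysis.component_usage_2015.qty",
      "analysis.component_usage_2015.amt")
    // In the Zeppelin notebook, the resulting simple names could be applied
    // with componentDF.toDF(simpleNames: _*) before registerTempTable.
    cols.map(simple).foreach(println)
  }
}
```

Whether this fixes the empty result depends on whether the qualified names are really the culprit; it at least makes the registered schema easier to query.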

Cinyoung

2017-09-01 12:00 GMT+09:00 Jianfeng (Jeff) Zhang <jzh...@hortonworks.com>:

>
> Have you tried that in spark-shell?
>
> Best Regards,
> Jeff Zhang
>
>
> From: Cinyoung Hur <cinyoung....@gmail.com>
> Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
> Date: Friday, September 1, 2017 at 10:43 AM
> To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
> Subject: Error in combining data from Tajo and MariaDB with Spark and
> Zeppelin
>
> Hi,
>
> I tried to combine two tables, one from Tajo, and the other from MariaDB.
> My spark interpreter has dependency on "org.apache.tajo:tajo-jdbc:0.11.0".
>
> But the Tajo table doesn't show anything.
> The following is the Spark code and its result.
>
> val componentDF = sqlc.load("jdbc", Map(
>     "url"-> "jdbc:tajo://tajo-master-ip:26002/analysis",
>     "driver"->"org.apache.tajo.jdbc.TajoDriver",
>     "dbtable"->"component_usage_2015"
>     ))
> componentDF.registerTempTable("components")
> val allComponents = sqlContext.sql("select * from components")
> allComponents.show(5)
>
> warning: there was one deprecation warning; re-run with -deprecation for details
> componentDF: org.apache.spark.sql.DataFrame = [analysis.component_usage_2015.gnl_nm_cd: string, analysis.component_usage_2015.qty: double ... 1 more field]
> warning: there was one deprecation warning; re-run with -deprecation for details
> allComponents: org.apache.spark.sql.DataFrame = [analysis.component_usage_2015.gnl_nm_cd: string, analysis.component_usage_2015.qty: double ... 1 more field]
> +---------------------------------------+---------------------------------+---------------------------------+
> |analysis.component_usage_2015.gnl_nm_cd|analysis.component_usage_2015.qty|analysis.component_usage_2015.amt|
> +---------------------------------------+---------------------------------+---------------------------------+
> +---------------------------------------+---------------------------------+---------------------------------+
>
