Thank you. I see this SQL in the Spark doc:
http://spark.apache.org/docs/1.6.1/sql-programming-guide.html
-- Original --
From: "Takeshi Yamamuro";
Date: Tue, Jul 26, 2016 6:15
To: "cj"<124411...@qq.com>;
Thank you, but I hope to read the Parquet file as a table in spark-sql, not
in a Java (or Scala) program.
-- Original --
From: "Kabeer Ahmed";
Date: Mon, Jul 25, 2016 10:36 PM
To: "cj"<124411...@qq.com>;
Cc: "user"
Hi all:
I use Spark 1.6.1 as my work environment.
When I saved the following content as a test1.sql file:
CREATE TEMPORARY TABLE parquetTable
USING org.apache.spark.sql.parquet
OPTIONS (
  path "examples/src/main/resources/people.parquet"
)
SELECT * FROM parquetTable
and use bin/spar
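[Editor's note: one likely cause of the failure is that spark-sql expects the CREATE and SELECT to be separate, terminated statements. A minimal corrected version of the file (a reconstruction, assuming it is run with the spark-sql CLI, e.g. `bin/spark-sql -f test1.sql`) might be:]

```sql
-- Register the Parquet file as a temporary table (Spark 1.6 data source
-- syntax), then query it. Table name and path are taken from the message
-- above; the semicolons separating the two statements are the fix.
CREATE TEMPORARY TABLE parquetTable
USING org.apache.spark.sql.parquet
OPTIONS (
  path "examples/src/main/resources/people.parquet"
);

SELECT * FROM parquetTable;
```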
Hi,
I have the following Spark Streaming application, which streams from a Kafka
topic, does some processing, and publishes the result to another topic. In
between, I read records from a Cassandra CF.
The issue is that while the application is running, if a new row is inserted
into the Cassandra CF, that n