I am trying to use HiveContext in Spark. The following statements run
fine:

from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)

But when I run the statement below,

sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")

I get the following error:

Java Package object not callable
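
In case it helps, here is the full sequence as a standalone script (a
minimal sketch of what I am doing; "hive-test" is just a placeholder app
name, and in the pyspark shell sc would already exist):

from pyspark import SparkContext
from pyspark.sql import HiveContext

# In the pyspark shell, sc already exists; created explicitly here so
# the snippet is self-contained.
sc = SparkContext(appName="hive-test")

# Wrapping the SparkContext in a HiveContext works without error.
sqlContext = HiveContext(sc)

# This is the statement that raises the error above.
sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")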

What could be the problem?
Thanks
