> 2.10:1.4.0 --master yarn-client -i
> /TestDivya/Spark/WriteToPheonix.scala
>
> Getting the below error:
>
> org.apache.spark.sql.AnalysisException:
> org.apache.phoenix.spark.DefaultSource
> does not allow user-specified schemas.;
>
> Am I on the right track or missing any properties?
>
> Because of this I am unable to proceed with Phoenix and have to find
> alternate options.
> Would really appreciate the help.
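For comparison, the save path described in the phoenix-spark docs looks roughly like this. This is only a sketch for the Spark 1.5 / Phoenix 4.4 combination in this thread; `OUTPUT_TABLE` and the `localhost:2181` zkUrl are placeholders, not values taken from the thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SQLContext, SaveMode}

// Sketch only: assumes a Phoenix table OUTPUT_TABLE(ID BIGINT PRIMARY KEY, COL1 VARCHAR)
// already exists, and that ZooKeeper is reachable at localhost:2181.
val sc = new SparkContext(new SparkConf().setAppName("phoenix-write-sketch"))
val sqlContext = new SQLContext(sc)

val df = sqlContext.createDataFrame(Seq((1L, "foo"), (2L, "bar"))).toDF("ID", "COL1")

// Note: no .schema(...) call anywhere -- the phoenix-spark DefaultSource
// rejects a user-specified schema and derives it from the target table.
df.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)
  .options(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> "localhost:2181"))
  .save()
```

Running this against a cluster that lacks the phoenix-spark classes on the classpath would fail earlier, at `format(...)` resolution, which is a useful way to distinguish that case from the schema error above.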
Reposting for other users' benefit.

-- Forwarded message --
From: Divya Gehlot
Date: 8 April 2016 at 19:54
Subject: Re: [HELP:]Save Spark Dataframe in Phoenix Table
To: Josh Mahonin

Hi Josh,
I am doing it in the same manner as mentioned in the Phoenix Spark documentation.
I am using the latest version of HDP, 2.3.4.
In case of a version mismatch or lack of Spark-Phoenix support, I would have to find alternate options.
Hi Divya,
That's strange. Are you able to post a snippet of your code to look at? And
are you sure that you're saving the dataframes as per the docs (
https://phoenix.apache.org/phoenix_spark.html)?
Depending on your HDP version, it may or may not actually have
phoenix-spark support. Double-check
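One way to do that double-check is to list the installed Phoenix client jar and look for the Spark integration classes. The path below is only an assumption about a typical HDP layout; adjust it to your install:

```shell
# Look for the phoenix-spark integration inside the Phoenix client jar.
# /usr/hdp/current/phoenix-client is a guess at the HDP layout; adjust as needed.
jar tf /usr/hdp/current/phoenix-client/phoenix-client.jar \
  | grep 'org/apache/phoenix/spark/' \
  || echo "phoenix-spark classes not found in this jar"
```

If the grep prints nothing, the jar shipped with that HDP build does not include the DataSource, and the `--jars`/`--packages` arguments to spark-shell need to supply it instead.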
Hi,
I have a Hortonworks Hadoop cluster with the below configuration:
Spark 1.5.2
HBase 1.1.x
Phoenix 4.4
I am able to connect to Phoenix through a JDBC connection and am able to read
the Phoenix tables.
But while writing the data back to a Phoenix table
I am getting the below error:
org.apache.spark.sql.AnalysisException:
org.apache.phoenix.spark.DefaultSource
does not allow user-specified schemas.;
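For what it's worth, in Spark 1.5 this AnalysisException is raised when an explicit schema is handed to a data source that only implements RelationProvider, which phoenix-spark's DefaultSource is. A sketch of the failing versus working read, assuming a hypothetical table `TABLE1` and a placeholder zkUrl:

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

val sqlContext: SQLContext = ??? // obtained from your SparkContext

val opts = Map("table" -> "TABLE1", "zkUrl" -> "localhost:2181")

// Fails with "does not allow user-specified schemas.": the explicit schema
// cannot be passed to a plain RelationProvider.
val schema = StructType(Seq(StructField("ID", LongType), StructField("COL1", StringType)))
val bad = sqlContext.read
  .format("org.apache.phoenix.spark")
  .schema(schema)
  .options(opts)
  .load()

// Works: omit .schema(...) and let Phoenix supply the schema from the
// table definition itself.
val good = sqlContext.read
  .format("org.apache.phoenix.spark")
  .options(opts)
  .load()
```

So a first thing to check in WriteToPheonix.scala is whether any read or write call supplies a schema (or uses something like `createExternalTable` with one); removing it and letting Phoenix derive the schema is the documented usage.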