Hello,

Good examples of how to interface with DynamoDB from Spark here:

https://aws.amazon.com/blogs/big-data/using-spark-sql-for-etl/
https://aws.amazon.com/blogs/big-data/analyze-your-data-on-amazon-dynamodb-with-apache-spark/
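
In case a concrete starting point helps, below is a rough, untested sketch of writing processed records to DynamoDB from Spark with the AWS Java SDK 1.11.x. The RDD type, the table name ("my-table") and the attribute names are made up for illustration; the client is built inside foreachPartition so it never has to be serialized from the driver.

import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient
import com.amazonaws.services.dynamodbv2.model.{AttributeValue, PutItemRequest}
import scala.collection.JavaConverters._

// rdd: RDD[(String, String)] of (id, payload) pairs -- illustrative names only
rdd.foreachPartition { partition =>
  // Create the client on the executor; credentials and region come from the
  // default provider chain (environment, instance profile, etc.)
  val client = new AmazonDynamoDBClient()
  partition.foreach { case (id, payload) =>
    val item = Map(
      "id"      -> new AttributeValue().withS(id),
      "payload" -> new AttributeValue().withS(payload)
    ).asJava
    client.putItem(new PutItemRequest().withTableName("my-table").withItem(item))
  }
  client.shutdown()
}

For a streaming job the same pattern applies per micro-batch via dstream.foreachRDD(_.foreachPartition { ... }).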

Thanks

On Mon, Dec 12, 2016 at 7:56 PM, Marco Mistroni <mmistr...@gmail.com> wrote:

> Hi
>  If it can help
> 1. Check the Javadocs for when that method was introduced.
> 2. Are you building a fat jar? Check which libraries have been included; some
> other dependency might have forced an old copy of the SDK to be included.
> 3. If you take the code outside Spark, does it work successfully?
> 4. Send a short sample.
> Hth
>
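On Marco's point 2: a NoSuchMethodError like this usually means an older aws-java-sdk-core is on the executor classpath (Spark/Hadoop distributions often bundle one) and wins over the 1.11.33 jar. A quick way to check, just a sketch, is to log from inside the job which jar the class was actually loaded from:

// Drop this into the job (e.g. before building the DynamoDB client) and check
// the executor logs; a None here would mean a bootstrap/bundled classloader.
val sdkJar = Option(classOf[com.amazonaws.SDKGlobalConfiguration]
  .getProtectionDomain.getCodeSource).map(_.getLocation)
println(s"SDKGlobalConfiguration loaded from: $sdkJar")

If the location points at an older SDK jar, shading the AWS classes in the fat jar or aligning the SDK version with what the cluster ships is the usual fix.
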
> On 12 Dec 2016 11:03 am, "Pratyaksh Sharma" <pratyaksh.sharma.ece12@itbhu.ac.in> wrote:
>
> Hey, I am using Apache Spark for a streaming application. I am trying to
> store the processed data into DynamoDB using the AWS Java SDK and am getting
> the following exception:
> 16/12/08 23:23:43 WARN TaskSetManager: Lost task 0.0 in stage 1.0:
> java.lang.NoSuchMethodError: com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()Z
>         at com.amazonaws.ClientConfigurationFactory.getConfig(ClientConfigurationFactory.java:35)
>         at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.<init>(AmazonDynamoDBClient.java:374)
>
> Spark version - 1.6.1
> Scala version - 2.10.5
> aws sdk version - 1.11.33
>
> Has anyone faced this issue? Any help will be highly appreciated.
>
> --
> Regards
>
> Pratyaksh Sharma
> 12105EN013
> Department of Electronics Engineering
> IIT Varanasi
> Contact No +91-8127030223
>
