Sent: 07 January 2019 12:32
To: yeikel valdes ; Shashikant Bangera
Cc: user@spark.apache.org
Subject: RE: [EXTERNAL] RE: Re: Spark Kinesis Connector SSL issue
Hi,
The issue is that the KCL inside the Spark Streaming connector does not
provide a way to pass KCL configuration in, which means we…
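For context, this is the kind of configuration object the KCL accepts when used directly. A sketch only: the application name, stream name, worker id, and endpoint are assumed values, and the point of the message above is precisely that the Spark connector constructs this object internally and offers no hook to supply your own:

```scala
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.{InitialPositionInStream, KinesisClientLibConfiguration}

// Direct-KCL configuration sketch (KCL 1.x). When you drive the KCL yourself
// you can point it at a custom endpoint (e.g. an internal proxy with its own
// certificates); the Spark Kinesis connector does not expose these settings.
val kclConfig = new KinesisClientLibConfiguration(
    "my-app",    // application name (assumed)
    "my-stream", // stream name (assumed)
    new DefaultAWSCredentialsProviderChain(),
    "worker-1")  // worker id (assumed)
  .withKinesisEndpoint("https://kinesis.us-east-1.amazonaws.com") // assumed endpoint
  .withInitialPositionInStream(InitialPositionInStream.LATEST)
```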
From: yeikel valdes [mailto:em...@yeikel.com]
Sent: 07 January 2019 12:15
To: Shashikant Bangera
Cc: user@spark.apache.org
Subject: [EXTERNAL] Re: Spark Kinesis Connector SSL issue
Can you call this service with regular code (no Spark)?
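A minimal non-Spark connectivity check along those lines might look like this. A sketch using the AWS Java SDK v1; the region and stream name are assumptions:

```scala
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder

// Plain AWS SDK call, no Spark involved: if this fails with the same SSL
// error, the problem is in the environment (certificates/proxy), not in
// the Spark connector.
val kinesis = AmazonKinesisClientBuilder.standard()
  .withRegion("us-east-1") // assumed region
  .build()

// "my-stream" is an assumed name; prints the stream status on success.
println(kinesis.describeStream("my-stream").getStreamDescription.getStreamStatus)
```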
On Mon, 07 Jan 2019 02:42:48 -0800 shashikantbang...@discover.com wrote:
Hi team,
Please help, we are kind of blocked here.
Cheers,
Shashi
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
Hi Team,
We are trying to access the endpoint through the library mentioned below, and
we get an SSL error. I think internally it uses the KCL library. If I have to
skip certificate verification, is that possible through a KCL utils call?
Because I do not find any provision to set no-verify=false within spa…
…limits
(spark.streaming.rateLimit) to prevent Spark from receiving data faster
than it can process.
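Setting a receiver rate limit is just a Spark configuration entry. Note the message above says spark.streaming.rateLimit; in released Spark versions the corresponding setting is spark.streaming.receiver.maxRate, and from Spark 1.5 on, spark.streaming.backpressure.enabled adjusts the rate automatically. A sketch, with illustrative values:

```scala
import org.apache.spark.SparkConf

// Cap each receiver at 1000 records/second so batch processing time stays
// below the batch interval (the numbers here are illustrative only).
val conf = new SparkConf()
  .setAppName("kinesis-consumer")
  .set("spark.streaming.receiver.maxRate", "1000")
  .set("spark.streaming.backpressure.enabled", "true") // Spark 1.5+
```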
On Mon, Aug 10, 2015 at 4:40 PM, Phil Kallos wrote:

> Hi! Sorry if this is a repost.
>
> I'm using Spark + Kinesis ASL to process and persist stream data to
> ElasticSearch. For the most part it works nicely.
>
> There is a subtle issue I'm running into about how failures are handled.
> For example's sake, let's say I am processing a Kinesis stream that
> produces 40…
I believe there is some work to fix this problem, similar to the Kafka direct API.
Not sure if this is it : https://issues.apache.org/jira/browse/SPARK-9215
Thanks,
Patanachai
On 08/06/2015 12:08 PM, phibit wrote:
Hi! I'm using Spark + Kinesis ASL to process and persist stream data to
ElasticSearch. For the most part it works nicely.
There is a subtle issue I'm running into about how failures are handled.
For example's sake, let's say I am processing a Kinesis stream that produces
40…
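The pipeline being described (Kinesis ASL feeding ElasticSearch) has roughly this shape. A sketch only: stream name, endpoint, batch interval, and the ES index are assumptions, and elasticsearch-hadoop's EsSpark is one common way to do the final write, not necessarily what the poster used:

```scala
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kinesis.KinesisUtils
import org.elasticsearch.spark.rdd.EsSpark

// Read a Kinesis stream and index each batch into ElasticSearch.
// The failure-handling gap discussed in this thread: if a batch fails
// mid-write, the KCL checkpoint in DynamoDB may already have advanced,
// so those records are not redelivered.
val ssc = new StreamingContext(new SparkConf().setAppName("kinesis-to-es"), Seconds(10))
val stream = KinesisUtils.createStream(
  ssc, "my-stream",                                // assumed stream name
  "https://kinesis.us-east-1.amazonaws.com",       // assumed endpoint
  Seconds(10), InitialPositionInStream.LATEST, StorageLevel.MEMORY_AND_DISK_2)

stream.map(bytes => new String(bytes)).foreachRDD { rdd =>
  // Assumes each record is already a JSON document string.
  EsSpark.saveJsonToEs(rdd, "events/doc") // assumed index/type
}
ssc.start()
ssc.awaitTermination()
```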
Thanks Chris! I was just looking to get back to Spark + Kinesis
integration. Will be in touch shortly.
Vadim
On Sun, May 10, 2015 at 12:14 AM, Chris Fregly wrote:
> hey vadim-
>
> sorry for the delay.
>
> if you're interested in trying to get Kinesis working one-on-one, …
thanks!
-chris
On Tue, Apr 7, 2015 at 6:17 PM, Vadim Bichutskiy wrote:
> Hey y'all,
>
> While I haven't been able to get Spark + Kinesis integration working, I
> pivoted to plan B: I now push data to S3 where I set up a DStream to
> monitor an S3 bucket with textFileStream…
Hey Chris!
I was happy to see the documentation outlining that issue :-) However, I
must have got into a pretty terrible state because I had to delete and
recreate the kinesis streams as well as the DynamoDB tables.
Thanks for the reply, everything is sorted.
Mike
On Fri, May 8, 2015 at 7:55, Chris Fregly wrote:
hey mike-
as you pointed out here from my docs, changing the stream name is sometimes
problematic due to the way the Kinesis Client Library manages leases and
checkpoints, etc. in DynamoDB.
I noticed this directly while developing the Kinesis connector which is why I
highlighted the issue here.
- [Kinesis stream name]: The Kinesis stream that this streaming application receives from
- The application name used in the streaming context becomes the Kinesis application name
- The application name must be unique for a given account and region.
- The Kinesis backe…
Hi All,
I am submitting the assembled fat jar file with the command:
bin/spark-submit --jars /spark-streaming-kinesis-asl_2.10-1.3.0.jar --class
com.xxx.Consumer -0.1-SNAPSHOT.jar
It reads the data from Kinesis using the stream name defined in a
configuration file. It turns out that it re…
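For comparison, a complete invocation typically looks like the following. The class name, jar paths, and artifact names here are placeholders, not the poster's actual values:

```shell
# Submit an assembled consumer jar. --jars adds the Kinesis ASL connector
# to the classpath of both the driver and the executors.
bin/spark-submit \
  --class com.example.Consumer \
  --jars lib/spark-streaming-kinesis-asl_2.10-1.3.0.jar \
  target/consumer-0.1-SNAPSHOT.jar
```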
Hey y'all,
While I haven't been able to get Spark + Kinesis integration working, I
pivoted to plan B: I now push data to S3 where I set up a DStream to
monitor an S3 bucket with textFileStream, and that works great.
I <3 Spark!
Best,
Vadim
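The S3-monitoring fallback can be as small as this. A sketch; the bucket path and batch interval are assumptions (s3n:// was the usual scheme in the Spark 1.x era):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Watch an S3 prefix for newly written text files; each file that appears
// becomes part of the next micro-batch.
val ssc = new StreamingContext(new SparkConf().setAppName("s3-monitor"), Seconds(30))
val lines = ssc.textFileStream("s3n://my-bucket/incoming/") // assumed bucket
lines.print()
ssc.start()
ssc.awaitTermination()
```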
On Mon, Apr 6, 2015 at 12:23 PM, Vadim Bichutskiy wrote:
>> /* Kinesis checkpoint interval. Same as batchInterval for this example. */
>> val kinesisCheckpointInterval = batchInterval
>> /* Create the same number of Kinesis DStreams/Receivers as Kinesis stream's shards */
>> val kinesisStreams = (0 until numStreams).map { i =>
>>   …
>> }
>> val unionStreams = ssc.union(kinesisStreams).map(byteArray => new String(byteArray))
>> unionStreams.print()
>> ssc.start()
>> ssc.awaitTermination()
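Filled out, the quoted snippet has roughly this shape (it resembles the KinesisWordCountASL example linked elsewhere in the thread). A sketch only: the stream name, endpoint, shard count, and the Spark 1.3-era createStream signature are assumptions, not values from the thread:

```scala
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Milliseconds, StreamingContext}
import org.apache.spark.streaming.kinesis.KinesisUtils

val batchInterval = Milliseconds(2000)
val ssc = new StreamingContext(
  new SparkConf().setAppName("KinesisWordCount"), batchInterval)

// Kinesis checkpoint interval; same as batchInterval for this example.
val kinesisCheckpointInterval = batchInterval

// Create the same number of Kinesis DStreams/Receivers as the stream has
// shards (2 is assumed here).
val numStreams = 2
val kinesisStreams = (0 until numStreams).map { i =>
  KinesisUtils.createStream(
    ssc, "my-stream",                          // assumed stream name
    "https://kinesis.us-east-1.amazonaws.com", // assumed endpoint
    kinesisCheckpointInterval, InitialPositionInStream.LATEST,
    StorageLevel.MEMORY_AND_DISK_2)
}

// Union the per-shard streams and decode the byte payloads.
val unionStreams = ssc.union(kinesisStreams).map(byteArray => new String(byteArray))
unionStreams.print()
ssc.start()
ssc.awaitTermination()
```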
>
>
> On Fri, Apr 3, 2015 at 3:48 PM, Tathagata Das wrote:
>
>> Just remove "provided" for spark-streaming-kinesis-asl
>
>> On Fri, Apr 3, 2015 at 3:43 PM, Kelly, Jonathan
>> wrote:
>>
>>> spark-streaming-kinesis-asl is not part of the Spark distribution on
>>> your cluster, so you cannot have it be just a "provided" dependency. This
>>> is also why the KCL and its dependencies were not included in the assembly
>>> (but yes, they should be).
>>>
>>> ~ Jonathan Kelly
>>>
>>> From: Vadim Bichutskiy
>>> Date: Friday, April 3, 2015 at 12:26 PM
>>> To: Jonathan Kelly
>>> Cc: "user@spark.apache.org"
>>> Subject: Re: Spark + Kinesis
Thanks. So how do I fix it?
On Fri,
Hi all,
Good n…
> libraryDependencies += "org.apache.spark" %% "spark-streaming-kinesis-asl" % "1.3.0"
>
> I think that may get you a little closer, though I think you're probably
> going to run into the same problems I ran into in this thread:
> https://www.mail-archive.com/user@spark.apache.org/msg23891.html
> I never really got an answer for that, and I temporarily moved on to other
> things for now.
~ Jonathan Kelly
From: 'Vadim Bichutskiy'
Date: Thursday, April 2, 2015 at 9:53 AM
To: "user@spark.apache.org"
Hi all,
I am trying to write an Amazon Kinesis consumer Scala app that processes
data in the
Kinesis stream. Is this the correct way to specify build.sbt:
---
import AssemblyKeys._
name := "Kinesis Consumer"
version := "1.0"
organization := "com.myconsumer"
scalaVersion := "2.11.5"
…