> ... loss, damage or destruction of data or any other property which may
> arise from relying on this email's technical content is explicitly
> disclaimed. The author will in no case be liable for any monetary damages
> arising from such loss, damage or destruction.
>
> On Thu, 29 Apr 2021 at 20:17, Eric Beabes
> wrote:
>
> > Correct. Question
> On Thu, 29 Apr 2021 at 18:35, Eric Beabes
> wrote:
>
> > We're thinking Kafka will allow us to sca
> On Thu, 29 Apr 2021 at 18:07, Eric Beabes
> wrote:
>
> > We’ve a use case wher
We have a use case where lots of messages will come in via AWS SQS from
various devices. We're thinking of reading these messages using Spark
Structured Streaming, cleaning them up as needed & saving each message to
Kafka. Later we're thinking of using the Kafka S3 Connector to push them to
S3 on an hourly basis.
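The "clean up and save to Kafka" leg of the pipeline described above can be sketched with Spark Structured Streaming's Kafka sink. This is a minimal sketch, not the poster's actual code: Spark has no built-in SQS source, so the ingestion side is left abstract, and the broker address, topic name, checkpoint path, and clean-up step are all placeholder assumptions.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object SqsToKafkaSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("sqs-to-kafka").getOrCreate()
    import spark.implicits._

    // Placeholder clean-up step; real logic would parse/validate messages.
    def clean(df: DataFrame): DataFrame = df.filter($"value".isNotNull)

    // SQS ingestion elided: Spark ships no SQS source, so this stands in
    // for a custom source or an intermediate landing store.
    val rawStream: DataFrame = ???

    clean(rawStream)
      .selectExpr("CAST(value AS STRING) AS value") // Kafka sink needs a value column
      .writeStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")       // placeholder
      .option("topic", "device-messages")                     // placeholder
      .option("checkpointLocation", "/tmp/ckpt/sqs-to-kafka") // placeholder
      .start()
      .awaitTermination()
  }
}
```

From there, Kafka Connect's S3 sink can partition the topic's data by time when flushing to S3, which matches the hourly push the poster describes.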
I have the following code in my Kafka Streams application:

.groupBy((_, myobj) => myobj.getId)(Grouped.`with`[String, Myobj])
.windowedBy(SessionWindows.`with`(Duration.ofMillis(10 * 60 * 1000)))
.count()
.toStream
.map((k, v) => (k.key(), v))
.to("kafka-streams-test")
Expectation: If
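Reassembled into a self-contained topology, the session-window fragment above might look like the sketch below. This is a hypothetical reconstruction, not the poster's code: "input-topic" is a placeholder, and an implicit Serde[Myobj] is assumed to be in scope alongside the built-in Scala serdes.

```scala
import java.time.Duration
import org.apache.kafka.streams.kstream.SessionWindows
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder

// Count events per device id, with sessions closed after a
// 10-minute inactivity gap, then emit (id, count) pairs.
val builder = new StreamsBuilder()
builder
  .stream[String, Myobj]("input-topic") // implicit Serde[Myobj] assumed
  .groupBy((_, myobj) => myobj.getId)
  .windowedBy(SessionWindows.`with`(Duration.ofMinutes(10)))
  .count()
  .toStream
  .map((windowedKey, count) => (windowedKey.key(), count))
  .to("kafka-streams-test")
```

Note that after a session window, the key type is Windowed[String], which is why the map back to the plain key is needed before writing out.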
deserializer(): Deserializer[MyAppObject] = new
MyAppObjectDeserializer
}
On Tue, Dec 1, 2020 at 1:33 PM Eric Beabes wrote:
I am grouping messages as follows:

.groupBy((_, myAppObject) => myAppObject.getId)(Grouped.`with`[String,
MyAppObject])

I am getting a message saying "No implicits found for parameter
valueSerde: Serde[MyAppObject]".
My understanding is I need to add an implicit like this...
implicit val my
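One way to satisfy that missing valueSerde is to build a Serde from a serializer/deserializer pair and mark it implicit. A hedged sketch: MyAppObjectDeserializer appears earlier in the thread, but MyAppObjectSerializer and the val name are assumptions.

```scala
// Alias the Java Serdes factory to avoid clashing with the Scala one.
import org.apache.kafka.common.serialization.{Serde, Serdes => JSerdes}

// Making this implicit lets Grouped.`with`[String, MyAppObject] resolve
// its valueSerde parameter. The serializer/deserializer classes are
// assumed to exist elsewhere in the application.
implicit val myAppObjectSerde: Serde[MyAppObject] =
  JSerdes.serdeFrom(new MyAppObjectSerializer, new MyAppObjectDeserializer)
```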
nd use KafkaStreams on the target cluster
>
> - don't use KafkaStreams but plain consumer/producer
>
>
>
> -Matthias
>
> On 12/1/20 10:58 AM, Eric Beabes wrote:
I need to read from a topic in one bootstrap server & write it to another
topic in another bootstrap server. Since there's only one
StreamsConfig.BOOTSTRAP_SERVERS_CONFIG property, I am wondering how to
accomplish this? Do I need to create 2 different KafkaStreams objects? One
for reading & the other for writing?
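The second option quoted above (plain consumer/producer instead of Kafka Streams) can be sketched as follows. This is a minimal illustration, not production code: the broker addresses, topic names, and group id are placeholders, and error handling, offset management, and shutdown are omitted.

```scala
import java.time.Duration
import java.util.{Collections, Properties}
import scala.collection.JavaConverters._ // scala.jdk.CollectionConverters on 2.13+
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// Bridge two clusters: consume from the source cluster, produce to the
// target cluster. Each client gets its own bootstrap.servers.
object TwoClusterBridge extends App {
  val consumerProps = new Properties()
  consumerProps.put("bootstrap.servers", "source-cluster:9092") // placeholder
  consumerProps.put("group.id", "bridge")
  consumerProps.put("key.deserializer",
    "org.apache.kafka.common.serialization.StringDeserializer")
  consumerProps.put("value.deserializer",
    "org.apache.kafka.common.serialization.StringDeserializer")

  val producerProps = new Properties()
  producerProps.put("bootstrap.servers", "target-cluster:9092") // placeholder
  producerProps.put("key.serializer",
    "org.apache.kafka.common.serialization.StringSerializer")
  producerProps.put("value.serializer",
    "org.apache.kafka.common.serialization.StringSerializer")

  val consumer = new KafkaConsumer[String, String](consumerProps)
  val producer = new KafkaProducer[String, String](producerProps)
  consumer.subscribe(Collections.singletonList("source-topic"))

  while (true) {
    val records = consumer.poll(Duration.ofMillis(500))
    records.asScala.foreach { r =>
      producer.send(new ProducerRecord("target-topic", r.key, r.value))
    }
  }
}
```

Note that Kafka also ships MirrorMaker for exactly this cluster-to-cluster copying; the hand-rolled bridge is only worthwhile if the records need transformation in flight.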
's the
next challenge.
Thanks again.
On Fri, Nov 20, 2020 at 12:40 PM Eric Beabes
wrote:
> Wow. This is amazing Daniel. THANKS A LOT!
>
> On Fri, Nov 20, 2020 at 9:44 AM Daniel Hinojosa <
> dhinoj...@evolutionnext.com> wrote:
>
>> By the way. It was even cleaner th
s the
> >> SessionWindowSerde, the last underscore is to include the rest from this
> >> package.
> >>
> >> import org.apache.kafka.streams.scala.Serdes.{timeWindowedSerde => _, _}
> >>
> >> Everything should be in the project, it is now a 2.11.12 and Kafka
&
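The `timeWindowedSerde => _` rename in the import above is Scala's member-exclusion idiom: renaming a member to `_` hides it, while the trailing `_` wildcards in everything else. A minimal standard-library illustration (nothing Kafka-specific; the names here are just for demonstration):

```scala
// `Pi => _` excludes Pi; the trailing `_` imports the rest of scala.math.
import scala.math.{Pi => _, _}

object ExclusionImportDemo extends App {
  println(sqrt(16.0)) // sqrt arrives via the wildcard; prints 4.0
  // println(Pi)      // would not compile: Pi was excluded above
}
```

In the thread's case this removes the built-in timeWindowedSerde from implicit scope so it cannot conflict with the session-windowed serde.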
}
> > > }
> > >
> > >
> > >
> > >
> > > On Thu, Nov 19, 2020 at 7:24 AM John Roesler
> > wrote:
> > >
> > > > Hi Eric,
> > > >
> > > > Sure thing. Assuming the definition of ‘produced’
format like JSON or AVRO. Just my two cents.
>
> I hope this helps!
> -John
>
> On Wed, Nov 18, 2020, at 14:02, Eric Beabes wrote:
I keep getting 'ambiguous implicit values' message in the following code.
I tried several things (as can be seen from a couple of lines I've
commented out). Any ideas on how to fix this? This is in Scala.

def createTopology(conf: Config, properties: Properties): Topology = {
  // implicit val
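A common way out of "ambiguous implicit values" is to stop relying on implicit resolution for the conflicting serde and pass it explicitly in the second parameter list. A hedged sketch; `stream`, MyAppObject, and `myAppObjectSerde` are placeholders standing in for the thread's actual definitions inside createTopology:

```scala
import org.apache.kafka.streams.kstream.Grouped
import org.apache.kafka.common.serialization.{Serdes => JSerdes}

// Supplying Grouped explicitly means the compiler never has to choose
// between two competing implicit serdes in scope.
stream
  .groupBy((_: String, v: MyAppObject) => v.getId)(
    Grouped.`with`(JSerdes.String(), myAppObjectSerde))
// ...then .windowedBy(...).count() etc. as in the earlier snippets.
```

The alternative, shown earlier in the thread, is to shrink implicit scope with an exclusion import so only one candidate serde remains.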