Gordon's suggestion seems like a good way to provide per-job credentials
based on application-specific properties. In contrast, Flink's built-in
JAAS features are aimed at making the Flink cluster's Kerberos credentials
available to jobs.
I want to reiterate that all jobs (for a given Flink cluster) […]
Hi Gwenhael,
Following up on this:
It turns out that what you need is already available with Kafka 0.10, using
dynamic JAAS configurations [1] instead of a static JAAS file like the one
you’re currently using.
The main thing to do is to set “sasl.jaas.config” in the config properties
for your individual Kafka consumers and producers.
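As a rough sketch of what that could look like in a Flink job (the broker
address, topic, group id, and credentials below are placeholders, and I’m
assuming SASL/PLAIN with a Kafka client that supports KIP-85, i.e. 0.10.2+):

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class PerJobKafkaCredentials {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker-a:9092");   // placeholder broker
        props.setProperty("group.id", "job-a-group");               // placeholder group id
        props.setProperty("security.protocol", "SASL_PLAINTEXT");
        props.setProperty("sasl.mechanism", "PLAIN");
        // Dynamic JAAS configuration (KIP-85): the credentials travel with the
        // client properties, so no static JAAS file or
        // -Djava.security.auth.login.config is needed. The trailing semicolon
        // in the value is required.
        props.setProperty("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"job-a-user\" password=\"job-a-secret\";");

        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer010<>("topic-a", new SimpleStringSchema(), props));

        stream.print();
        env.execute("per-job-kafka-credentials");
    }
}

Since the credentials live in the client properties, each job (or even each
individual consumer/producer) can carry its own “sasl.jaas.config” value,
which is exactly what a single cluster-wide JAAS file cannot express.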
Hi Gwenhael,
Sorry for the very long delay in responding to this.
As you noticed, the “KafkaClient” entry name seems to be a hardcoded thing on
the Kafka side, so currently I don’t think what you’re asking for is possible.
It seems like this could be made possible with some of the new authentication features […]
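For reference, the static approach only has room for a single, JVM-wide login
context named “KafkaClient”, roughly like the sketch below (login module and
credentials are made-up placeholders). Because every Kafka client in the JVM
looks up that one entry, different consumers/producers on the same TaskManager
can’t get different credentials this way:

KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="shared-user"
  password="shared-secret";
};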
Hi Gwenhael,
I'm not a Kafka expert, but if something is hardcoded that shouldn't be, it
might be worth opening an issue for it. I'll loop in somebody who might
know more about your problem.
Timo
On 26/04/17 at 14:47, Gwenhael Pasquiers wrote:
Hello,
Up to now we’ve been using Kafka with JAAS (plain login/password) in the
following way:
- yarnship the JAAS file
- add the JAAS file name into “flink-conf.yaml” using the “env.java.opts”
  property (roughly as sketched below)
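Concretely, that means something like this in “flink-conf.yaml” (the file name
here is just an example; the yarnshipped file should end up in the YARN
container’s working directory):

# flink-conf.yaml
env.java.opts: -Djava.security.auth.login.config=./kafka_client_jaas.conf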
How can we support multiple secured Kafka 0.10 consumers and producers (with
different credentials)?