Gordon's suggestion seems like a good way to provide per-job credentials based on application-specific properties. In contrast, Flink's built-in JAAS features are aimed at making the Flink cluster's Kerberos credentials available to jobs.
I want to reiterate that all jobs (for a given Flink cluster) run with full privilege in the JVM, and that Flink does not guarantee isolation (of security information) between jobs. My suggestion is to not run untrusted job code in a shared Flink cluster.

On Wed, May 24, 2017 at 8:46 PM, Tzu-Li (Gordon) Tai <tzuli...@apache.org> wrote:
> Hi Gwenhael,
>
> Follow-up for this:
>
> It turns out what you require is already available with Kafka 0.10, using
> dynamic JAAS configurations [1] instead of a static JAAS file like the one
> you're currently using.
>
> The main thing to do is to set "sasl.jaas.config" in the config properties
> for each individual Kafka consumer / producer.
> This will override any static JAAS configuration.
> Note two things here: 1) a static JAAS configuration is a JVM process-wide
> installation, meaning that all Kafka clients within the same process can
> only ever share the same credentials, and 2) "KafkaClient" is a fixed JAAS
> lookup section key that the Kafka clients use, which I don't think is
> modifiable. So the static config approach would never work.
>
> An example "sasl.jaas.config" for plain logins:
> org.apache.kafka.common.security.plain.PlainLoginModule required
> username="xxxx" password="yyyy";
>
> Simply use different values for each of the Kafka consumers / producers
> you're using.
>
> Cheers,
> Gordon
>
>
> On 8 May 2017 at 4:42:07 PM, Tzu-Li (Gordon) Tai (tzuli...@apache.org) wrote:
>
> Hi Gwenhael,
>
> Sorry for the very long delayed response on this.
>
> As you noticed, the "KafkaClient" entry name seems to be hardcoded on the
> Kafka side, so currently I don't think what you're asking for is possible.
>
> It seems like this could be made possible with some of the new
> authentication features in Kafka 0.10 that seem related: [1] [2].
>
> I'm not that deep into the authentication modules, but I'll take a look
> and can keep you posted on this.
> Also looping in Eron (in CC) who could perhaps provide more insight on
> this at the same time.
>
> Cheers,
> Gordon
>
> [1] https://cwiki.apache.org/confluence/display/KAFKA/KIP-83+-+Allow+multiple+SASL+authenticated+Java+clients+in+a+single+JVM+process
> [2] https://cwiki.apache.org/confluence/display/KAFKA/KIP-85%3A+Dynamic+JAAS+configuration+for+Kafka+clients
>
> On 26 April 2017 at 8:48:20 PM, Gwenhael Pasquiers (gwenhael.pasqui...@ericsson.com) wrote:
>
> Hello,
>
> Up to now we've been using Kafka with JAAS (plain login/password) the
> following way:
>
> - yarnship the JAAS file
>
> - add the JAAS file name into "flink-conf.yaml" using the property
>   "env.java.opts"
>
> How can we support multiple secured Kafka 0.10 consumers and producers
> (with different logins and passwords, of course)?
>
> From what I saw in the Kafka sources, the entry name "KafkaClient" is
> hardcoded…
>
> Best Regards,
>
> Gwenhaël PASQUIERS
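To make the dynamic JAAS approach above concrete, here is a minimal sketch in Java against the Flink Kafka 0.10 connector. The broker addresses, topic names, credentials, and the kafkaProps helper are placeholders for illustration only; the "sasl.jaas.config" property and the PlainLoginModule entry are the pieces described in the thread.

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class PerClientJaasExample {

    // Hypothetical helper: builds per-client SASL/PLAIN settings using the
    // dynamic JAAS configuration (KIP-85) instead of a static JAAS file.
    private static Properties kafkaProps(String brokers, String user, String password) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", brokers);
        props.setProperty("security.protocol", "SASL_PLAINTEXT");
        props.setProperty("sasl.mechanism", "PLAIN");
        // Overrides any static JAAS file / "KafkaClient" section for this client only.
        props.setProperty("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"" + user + "\" password=\"" + password + "\";");
        return props;
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // The consumer and the producer each get their own credentials,
        // even though they run in the same JVM / the same Flink job.
        Properties consumerProps = kafkaProps("broker-a:9092", "user-a", "secret-a");
        consumerProps.setProperty("group.id", "example-group");

        Properties producerProps = kafkaProps("broker-b:9092", "user-b", "secret-b");

        DataStream<String> stream = env.addSource(
            new FlinkKafkaConsumer010<>("input-topic", new SimpleStringSchema(), consumerProps));

        stream.addSink(
            new FlinkKafkaProducer010<>("output-topic", new SimpleStringSchema(), producerProps));

        env.execute("per-client-jaas-example");
    }
}

Each consumer / producer instance keeps its own Properties object, so credentials are scoped per Kafka client rather than per JVM; as noted above, this still does not provide isolation between jobs sharing the same Flink cluster.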