Chris, here are the contents of the files:

## distributor file:

bootstrap.servers=broker:9096
group.id=dbz-dev

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

offset.storage.topic=connect-offsets
offset.storage.replication.factor=3
offset.storage.partitions=3

config.storage.topic=connect-configs
config.storage.replication.factor=3

status.storage.topic=connect-status
status.storage.replication.factor=3

# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000
rest.host.name=fqdn
rest.port=8083
rest.advertised.host.name=fqdn
rest.advertised.port=8083

sasl.mechanism=SCRAM-SHA-512
request.timeout.ms=20000
retry.backoff.ms=500

config.providers=file
config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider

sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="${file:/app/data/cred/connector_credentials.properties:kuser}" \
  password="${file:/app/data/cred/connector_credentials.properties:kpassword}";
security.protocol=SASL_SSL

consumer.sasl.mechanism=SCRAM-SHA-512
consumer.request.timeout.ms=300000
consumer.retry.backoff.ms=500
consumer.buffer.memory=2097152
consumer.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="${file:/app/data/cred/connector_credentials.properties:kuser}" \
  password="${file:/app/data/cred/connector_credentials.properties:kpassword}";
consumer.security.protocol=SASL_SSL

producer.sasl.mechanism=SCRAM-SHA-512
producer.request.timeout.ms=300000
producer.retry.backoff.ms=500
producer.buffer.memory=2097152
producer.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="${file:/app/data/cred/connector_credentials.properties:kuser}" \
  password="${file:/app/data/cred/connector_credentials.properties:kpassword}";
producer.security.protocol=SASL_SSL

plugin.path=/app/kafka/plugins
## eof
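
For reference, here's the shape of the credentials file those ${file:...}
placeholders resolve against. The four key names (kuser, kpassword, user,
password) are the ones referenced in the configs; the values below are
invented placeholders:

## credentials file (/app/data/cred/connector_credentials.properties), sketch:
kuser=kafka-sasl-username
kpassword=kafka-sasl-password
user=mysql-username
password=mysql-password
## eof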

## connector file
{
  "name": "dbz-panamax-list-domain-general-01",
  "config": {
      "auto.create.topics": "false",
      "binlog.buffer.size": "4048",
      "connector.class": "io.debezium.connector.mysql.MySqlConnector",
      "database.history.consumer.sasl.jaas.config":
"org.apache.kafka.common.security.scram.ScramLoginModule required
username=\"${file:/app/data/cred/connector_credentials.properties:kuser}\"
password=\"${file:/app/data/cred/connector_credentials.properties:kpassword}\";",
      "database.history.consumer.sasl.mechanism": "SCRAM-SHA-512",
      "database.history.consumer.security.protocol": "SASL_SSL",
      "database.history.kafka.bootstrap.servers": "broker:9096",
      "database.history.kafka.topic": "dbhistory.db",
      "database.history.producer.sasl.jaas.config":
"org.apache.kafka.common.security.scram.ScramLoginModule required
username=\"${file:/app/data/cred/connector_credentials.properties:kuser}\"
password=\"${file:/app/data/cred/connector_credentials.properties:kpassword}\";",
      "database.history.producer.sasl.mechanism": "SCRAM-SHA-512",
      "database.history.producer.security.protocol": "SASL_SSL",
      "database.hostname": "host",
      "database.include.list": "db_name",
      "database.password":
"${file:/app/data/cred/connector_credentials.properties:password}",
      "database.port": "9908",
      "database.server.name": "server_name",
      "database.user":
"${file:/app/data/cred/connector_credentials.properties:user}",
      "errors.log.enable": "true",
      "errors.log.include.messages": "true",
      "errors.tolerance": "all",
      "include.schema.changes": "false",
      "signal.data.collection": "dbz.debezium_signal",
      "snapshot.locking.mode": "minimal",
      "snapshot.mode": "initial",
      "table.include.list":
"list.lr_cust_extrnl_prod,list.lr_cust_vndr_info",
      "tasks.max": "1",
      "timestampConverter.format.datetime": "YYYY-MM-dd'T'HH:mm:ss.SSS'Z'",
      "timestampConverter.type":
"oryanmoshe.kafka.connect.util.TimestampConverter",
      "transforms.Reroute.key.enforce.uniqueness": "false",
      "transforms.Reroute.topic.regex": "(.*)",
      "transforms.Reroute.topic.replacement": "list-cdc-generals-02",
      "transforms.Reroute.type":
"io.debezium.transforms.ByLogicalTableRouter",
      "transforms": "Reroute"
  }
}
## eof
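
In case it's useful, the connector JSON above is submitted to the worker's
REST API along these lines (a sketch: fqdn:8083 matches the rest.advertised
settings in the distributor file, and connector.json is the file shown above):

curl -s -X POST -H "Content-Type: application/json" \
  --data @connector.json http://fqdn:8083/connectors

and the task status (including any stack trace) is pulled with:

curl -s http://fqdn:8083/connectors/dbz-panamax-list-domain-general-01/status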

thanks

On Mon, Mar 7, 2022 at 2:48 PM Chris Egerton <fearthecel...@gmail.com>
wrote:

> It looks like the file config provider isn't actually set up on the Connect
> worker. What does your Connect worker config look like (usually a file
> called something like connect-distributed.properties)? Feel free to change
> any sensitive values to a string like "<redacted>", but please don't remove
> them entirely (they may be necessary for debugging).
>
> On Mon, Mar 7, 2022 at 4:39 PM Men Lim <zulu...@gmail.com> wrote:
>
> > Thanks for the response, Chris.  I went through the setup again and it
> > appears I might have had a typo somewhere last Friday.  Currently, I'm
> > running into a file permission issue.
> >
> > the file has the following permissions:
> >
> > -rw-r--r-- 1 adm admn 88 Mar  7 21:23 connector_credentials.properties
> >
> > I have tried changing the file's permissions to 700 but still get the same error:
> >
> > Unable to connect: Access denied for user
> > '${file:/app/data/cred/connector_credentials.prop'@'172.x.x.x' (using
> > password: YES)
> >
> > On Mon, Mar 7, 2022 at 1:55 PM Chris Egerton <fearthecel...@gmail.com>
> > wrote:
> >
> > > Hi Men,
> > >
> > > That config snippet has a small syntax error: all double quotes should
> > > be escaped. Assuming you tried something like this:
> > >
> > > "database.history.producer.sasl.jaas.config":
> > > "org.apache.kafka.common.security.scram.ScramLoginModule required
> > > username=\"${file:/path/file.pro:user\"} password=\"${file:/path/
> > file.pro
> > > :password}\";"
> > >
> > > and still ran into issues, we'd probably need to see log files or, at
> > > the very least, the stack trace for the task from the REST API (if it
> > > failed at all) in order to follow up and provide more help.
> > >
> > > Cheers,
> > >
> > > Chris
> > >
> > > On Mon, Mar 7, 2022 at 3:26 PM Men Lim <zulu...@gmail.com> wrote:
> > >
> > > > Hi Chris,
> > > > I was getting an unauthorized/authentication error message when I was
> > > > trying it out last Friday.  I tried looking for the exact message in
> > > > the connect.log.* files but was not very successful.  In my connector
> > > > file, I have
> > > >
> > > > {
> > > >  "name":"blah",
> > > >  "config": {
> > > >      ...
> > > >      ...
> > > >      "database.history.producer.sasl.jaas.config":
> > > > "org.apache.kafka.common.security.scram.ScramLoginModule required
> > > > username=\"000\" password=\"000000\";",
> > > >      ...
> > > >   }
> > > > }
> > > >
> > > > I changed the database.history.producer.sasl.jaas.config to:
> > > >
> > > > "database.history.producer.sasl.jaas.config":
> > > > "org.apache.kafka.common.security.scram.ScramLoginModule required
> > > > username="${file:/path/file.pro:user"}
> > > > password="${file:/path/file.pro:password}";",
> > > >
> > > > On Mon, Mar 7, 2022 at 9:46 AM Chris Egerton <fearthecel...@gmail.com>
> > > > wrote:
> > > >
> > > > > Hi Men,
> > > > >
> > > > > The config provider mechanism should work for every property in a
> > > > > connector config, and every property in a worker config except for
> > > > > the plugin.path property (see KAFKA-9845 [1]). You can also use it
> > > > > for only part of a single property, or even multiple parts, like in
> > > > > this example (assuming a config provider named "file"):
> > > > >
> > > > > sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule
> > > > > required username="${file:/some/file.properties:username}"
> > > > > password="${file:/some/file.properties:password}"
> > > > >
> > > > > What sorts of errors are you seeing when trying to use a config
> > > > > provider with sasl/scram credentials?
> > > > >
> > > > > [1] - https://issues.apache.org/jira/browse/KAFKA-9845
> > > > >
> > > > > Cheers,
> > > > >
> > > > > Chris
> > > > >
> > > > > On Mon, Mar 7, 2022 at 10:35 AM Men Lim <zulu...@gmail.com> wrote:
> > > > >
> > > > > > Hi all,
> > > > > >
> > > > > > recently, I found out about
> > > > > >
> > > > > > config.providers=file
> > > > > >
> > > > > > config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider
> > > > > >
> > > > > > This works great for moving our embedded database password into
> > > > > > an external file.  However, it does not work when I tried to do
> > > > > > the same thing with the sasl/scram username and password found in
> > > > > > the distributor or connector file for Kafka Connect:
> > > > > >
> > > > > >
> > > > > > sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule
> > > > > > required \
> > > > > > username="000" password="some_password";
> > > > > >
> > > > > > I was wondering if there's a way to secure these passwords as well?
> > > > > >
> > > > > > Thanks,
> > > > > >