This is an automated email from the ASF dual-hosted git repository.
acosentino pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/camel.git
The following commit(s) were added to refs/heads/master by this push:
new 59a5d3c Regen
59a5d3c is described below
commit 59a5d3c769ef14257111e505efd8810e4ec1ef42
Author: Andrea Cosentino <[email protected]>
AuthorDate: Tue Mar 12 15:39:31 2019 +0100
Regen
---
.../camel-kafka/src/main/docs/kafka-component.adoc | 6 +++---
components/readme.adoc | 9 ++-------
docs/components/modules/ROOT/pages/kafka-component.adoc | 16 +++++++++-------
3 files changed, 14 insertions(+), 17 deletions(-)
diff --git a/components/camel-kafka/src/main/docs/kafka-component.adoc b/components/camel-kafka/src/main/docs/kafka-component.adoc
index fdaffc9..5d75855 100644
--- a/components/camel-kafka/src/main/docs/kafka-component.adoc
+++ b/components/camel-kafka/src/main/docs/kafka-component.adoc
@@ -191,7 +191,7 @@ When using Spring Boot make sure to use the following Maven dependency to have s
----
-The component supports 99 options, which are listed below.
+The component supports 100 options, which are listed below.
@@ -265,12 +265,13 @@ The component supports 99 options, which are listed below.
| *camel.component.kafka.configuration.sasl-jaas-config* | Expose the kafka
sasl.jaas.config parameter Example:
org.apache.kafka.common.security.plain.PlainLoginModule required
username="USERNAME" password="PASSWORD"; | | String
| *camel.component.kafka.configuration.sasl-kerberos-service-name* | The
Kerberos principal name that Kafka runs as. This can be defined either in
Kafka's JAAS config or in Kafka's config. | | String
| *camel.component.kafka.configuration.sasl-mechanism* | The Simple
Authentication and Security Layer (SASL) Mechanism used. For the valid values
see <a href=
"http://www.iana.org/assignments/sasl-mechanisms/sasl-mechanisms.xhtml">http://www.iana.org/assignments/sasl-mechanisms/sasl-mechanisms.xhtml</a>
| GSSAPI | String
-| *camel.component.kafka.configuration.schema-registry-u-r-l* | URL of the
Confluent schema registry servers to use. The format is
host1:port1,host2:port2. This is known as schema.registry.url in the Confluent
documentation. <p/> This option is only available in the Confluent Kafka
product (not standard Apache Kafka) | | String
+| *camel.component.kafka.configuration.schema-registry-u-r-l* | URL of the
Confluent schema registry servers to use. The format is
host1:port1,host2:port2. This is known as schema.registry.url in the Confluent
documentation. This option is only available in the Confluent Kafka product
(not standard Apache Kafka) | | String
| *camel.component.kafka.configuration.security-protocol* | Protocol used to
communicate with brokers. SASL_PLAINTEXT, PLAINTEXT and SSL are supported |
PLAINTEXT | String
| *camel.component.kafka.configuration.seek-to* | Set if KafkaConsumer will
read from beginning or end on startup: beginning : read from beginning end :
read from end This is replacing the earlier property seekToBeginning | | String
| *camel.component.kafka.configuration.send-buffer-bytes* | Socket write
buffer size | 131072 | Integer
| *camel.component.kafka.configuration.serializer-class* | The serializer
class for messages. | org.apache.kafka.common.serialization.StringSerializer |
String
| *camel.component.kafka.configuration.session-timeout-ms* | The timeout used
to detect failures when using Kafka's group management facilities. | 10000 |
Integer
+| *camel.component.kafka.configuration.specific-avro-reader* | This enables
the use of a specific Avro reader for use with the Confluent schema registry
and the io.confluent.kafka.serializers.KafkaAvroDeserializer. This option is
only available in the Confluent Kafka product (not standard Apache Kafka) |
false | Boolean
| *camel.component.kafka.configuration.ssl-cipher-suites* | A list of cipher
suites. This is a named combination of authentication, encryption, MAC and key
exchange algorithm used to negotiate the security settings for a network
connection using TLS or SSL network protocol. By default all the available
cipher suites are supported. | | String
| *camel.component.kafka.configuration.ssl-context-parameters* | SSL
configuration using a Camel {@link SSLContextParameters} object. If configured
it's applied before the other SSL endpoint parameters. | | SSLContextParameters
| *camel.component.kafka.configuration.ssl-enabled-protocols* | The list of
protocols enabled for SSL connections. TLSv1.2, TLSv1.1 and TLSv1 are enabled
by default. | TLSv1.2,TLSv1.1,TLSv1 | String
@@ -286,7 +287,6 @@ The component supports 99 options, which are listed below.
| *camel.component.kafka.configuration.ssl-truststore-location* | The location
of the trust store file. | | String
| *camel.component.kafka.configuration.ssl-truststore-password* | The password
for the trust store file. | | String
| *camel.component.kafka.configuration.ssl-truststore-type* | The file format
of the trust store file. Default value is JKS. | JKS | String
-| *camel.component.kafka.configuration.specificAvroReader* | This enables the
use of a specific Avro reader for use with the Confluent schema registry and
the io.confluent.kafka.serializers.KafkaAvroDeserializer. The default value is
false. | false | boolean
| *camel.component.kafka.configuration.topic* | Name of the topic to use. On
the consumer you can use comma to separate multiple topics. A producer can only
send a message to a single topic. | | String
| *camel.component.kafka.configuration.topic-is-pattern* | Whether the topic
is a pattern (regular expression). This can be used to subscribe to dynamic
number of topics matching the pattern. | false | Boolean
| *camel.component.kafka.configuration.value-deserializer* | Deserializer
class for value that implements the Deserializer interface. |
org.apache.kafka.common.serialization.StringDeserializer | String
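The Spring Boot properties added in this hunk can be wired together as follows. This is a minimal illustrative sketch, not part of the commit: the property names come from the table above, while the broker and registry host names are placeholders.

```properties
# Hypothetical Spring Boot configuration exercising the Confluent-only options
# documented above (broker/registry addresses are placeholders).
camel.component.kafka.configuration.brokers=broker1:9092
camel.component.kafka.configuration.schema-registry-u-r-l=registry1:8081
camel.component.kafka.configuration.specific-avro-reader=true
camel.component.kafka.configuration.value-deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
```

As the table notes, both schema-registry-u-r-l and specific-avro-reader take effect only with the Confluent Kafka product, not standard Apache Kafka.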
diff --git a/components/readme.adoc b/components/readme.adoc
index cca22fb..583ea19 100644
--- a/components/readme.adoc
+++ b/components/readme.adoc
@@ -1,7 +1,7 @@
==== Components
// components: START
-Number of Components: 292 in 227 JAR artifacts (0 deprecated)
+Number of Components: 291 in 226 JAR artifacts (0 deprecated)
[width="100%",cols="4,1,5",options="header"]
|===
@@ -112,9 +112,6 @@ Number of Components: 292 in 227 JAR artifacts (0 deprecated)
| link:camel-aws-sns/src/main/docs/aws-sns-component.adoc[AWS Simple
Notification System] (camel-aws-sns) +
`aws-sns:topicNameOrArn` | 2.8 | The aws-sns component is used for sending
messages to an Amazon Simple Notification Topic.
-| link:camel-aws-sqs/src/main/docs/aws-sqs-component.adoc[AWS Simple Queue
Service] (camel-aws-sqs) +
-`aws-sqs:queueNameOrArn` | 2.6 | The aws-sqs component is used for sending and
receiving messages to Amazon's SQS service.
-
| link:camel-aws-swf/src/main/docs/aws-swf-component.adoc[AWS Simple Workflow]
(camel-aws-swf) +
`aws-swf:type` | 2.13 | The aws-swf component is used for managing workflows
from Amazon Simple Workflow.
@@ -1040,7 +1037,7 @@ Number of Languages: 18 in 9 JAR artifacts (1 deprecated)
==== Miscellaneous Components
// others: START
-Number of Miscellaneous Components: 32 in 32 JAR artifacts (0 deprecated)
+Number of Miscellaneous Components: 31 in 31 JAR artifacts (0 deprecated)
[width="100%",cols="4,1,5",options="header"]
|===
@@ -1050,8 +1047,6 @@ Number of Miscellaneous Components: 32 in 32 JAR artifacts (0 deprecated)
| link:camel-cdi/src/main/docs/cdi.adoc[CDI] (camel-cdi) | 2.10 | Using Camel
with CDI
-| link:camel-cxf/src/main/docs/cxf.adoc[Cxf] (camel-cxf) | | Camel CXF support
-
| link:camel-cxf-transport/src/main/docs/cxf-transport.adoc[CXF Transport]
(camel-cxf-transport) | 2.8 | Camel Transport for Apache CXF
| link:camel-headersmap/src/main/docs/headersmap.adoc[Headersmap]
(camel-headersmap) | 2.20 | Fast case-insensitive headers map implementation
diff --git a/docs/components/modules/ROOT/pages/kafka-component.adoc b/docs/components/modules/ROOT/pages/kafka-component.adoc
index 81b06f7..5d75855 100644
--- a/docs/components/modules/ROOT/pages/kafka-component.adoc
+++ b/docs/components/modules/ROOT/pages/kafka-component.adoc
@@ -72,7 +72,7 @@ with the following path and query parameters:
|===
-==== Query Parameters (94 parameters):
+==== Query Parameters (95 parameters):
[width="100%",cols="2,5,^1,2",options="header"]
@@ -108,6 +108,7 @@ with the following path and query parameters:
| *pollTimeoutMs* (consumer) | The timeout used when polling the
KafkaConsumer. | 5000 | Long
| *seekTo* (consumer) | Set if KafkaConsumer will read from beginning or end
on startup: beginning : read from beginning end : read from end This is
replacing the earlier property seekToBeginning | | String
| *sessionTimeoutMs* (consumer) | The timeout used to detect failures when
using Kafka's group management facilities. | 10000 | Integer
+| *specificAvroReader* (consumer) | This enables the use of a specific Avro
reader for use with the Confluent schema registry and the
io.confluent.kafka.serializers.KafkaAvroDeserializer. This option is only
available in the Confluent Kafka product (not standard Apache Kafka) | false |
boolean
| *topicIsPattern* (consumer) | Whether the topic is a pattern (regular
expression). This can be used to subscribe to dynamic number of topics matching
the pattern. | false | boolean
| *valueDeserializer* (consumer) | Deserializer class for value that
implements the Deserializer interface. |
org.apache.kafka.common.serialization.StringDeserializer | String
| *exceptionHandler* (consumer) | To let the consumer use a custom
ExceptionHandler. Notice if the option bridgeErrorHandler is enabled then this
option is not in use. By default the consumer will deal with exceptions, that
will be logged at WARN or ERROR level and ignored. | | ExceptionHandler
@@ -161,16 +162,16 @@ with the following path and query parameters:
| *sslEnabledProtocols* (security) | The list of protocols enabled for SSL
connections. TLSv1.2, TLSv1.1 and TLSv1 are enabled by default. |
TLSv1.2,TLSv1.1,TLSv1 | String
| *sslEndpointAlgorithm* (security) | The endpoint identification algorithm to
validate server hostname using server certificate. | | String
| *sslKeymanagerAlgorithm* (security) | The algorithm used by key manager
factory for SSL connections. Default value is the key manager factory algorithm
configured for the Java Virtual Machine. | SunX509 | String
+| *sslKeyPassword* (security) | The password of the private key in the key
store file. This is optional for client. | | String
+| *sslKeystoreLocation* (security) | The location of the key store file. This
is optional for client and can be used for two-way authentication for client. |
| String
+| *sslKeystorePassword* (security) | The store password for the key store
file. This is optional for client and only needed if ssl.keystore.location is
configured. | | String
| *sslKeystoreType* (security) | The file format of the key store file. This
is optional for client. Default value is JKS | JKS | String
| *sslProtocol* (security) | The SSL protocol used to generate the SSLContext.
Default setting is TLS, which is fine for most cases. Allowed values in recent
JVMs are TLS, TLSv1.1 and TLSv1.2. SSL, SSLv2 and SSLv3 may be supported in
older JVMs, but their usage is discouraged due to known security
vulnerabilities. | TLS | String
| *sslProvider* (security) | The name of the security provider used for SSL
connections. Default value is the default security provider of the JVM. | |
String
| *sslTrustmanagerAlgorithm* (security) | The algorithm used by trust manager
factory for SSL connections. Default value is the trust manager factory
algorithm configured for the Java Virtual Machine. | PKIX | String
+| *sslTruststoreLocation* (security) | The location of the trust store file. |
| String
| *sslTruststoreType* (security) | The file format of the trust store file.
Default value is JKS. | JKS | String
| *schemaRegistryURL* (confluent) | URL of the Confluent schema registry
servers to use. The format is host1:port1,host2:port2. This is known as
schema.registry.url in the Confluent documentation. This option is only
available in the Confluent Kafka product (not standard Apache Kafka) | | String
-| *sslKeyPassword* (security) | The password of the private key in the key
store file. This is optional for client. | | String
-| *sslKeystoreLocation* (security) | The location of the key store file. This
is optional for client and can be used for two-way authentication for client. |
| String
-| *sslKeystorePassword* (security) | The store password for the key store
file.This is optional for client and only needed if ssl.keystore.location is
configured. | | String
-| *sslTruststoreLocation* (security) | The location of the trust store file. |
| String
| *sslTruststorePassword* (security) | The password for the trust store file.
| | String
|===
// endpoint options: END
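The new specificAvroReader consumer query parameter documented above can also be set directly on an endpoint URI. A hedged illustration follows; the topic name and host addresses are placeholders, not taken from this commit.

```
kafka:my-topic?brokers=broker1:9092&schemaRegistryURL=registry1:8081&specificAvroReader=true&valueDeserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
```

Per the parameter descriptions, this combination applies only when using the Confluent Kafka product with the Confluent schema registry.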
@@ -190,7 +191,7 @@ When using Spring Boot make sure to use the following Maven dependency to have s
----
-The component supports 99 options, which are listed below.
+The component supports 100 options, which are listed below.
@@ -264,12 +265,13 @@ The component supports 99 options, which are listed below.
| *camel.component.kafka.configuration.sasl-jaas-config* | Expose the kafka
sasl.jaas.config parameter Example:
org.apache.kafka.common.security.plain.PlainLoginModule required
username="USERNAME" password="PASSWORD"; | | String
| *camel.component.kafka.configuration.sasl-kerberos-service-name* | The
Kerberos principal name that Kafka runs as. This can be defined either in
Kafka's JAAS config or in Kafka's config. | | String
| *camel.component.kafka.configuration.sasl-mechanism* | The Simple
Authentication and Security Layer (SASL) Mechanism used. For the valid values
see <a href=
"http://www.iana.org/assignments/sasl-mechanisms/sasl-mechanisms.xhtml">http://www.iana.org/assignments/sasl-mechanisms/sasl-mechanisms.xhtml</a>
| GSSAPI | String
-| *camel.component.kafka.configuration.schema-registry-u-r-l* | URL of the
Confluent schema registry servers to use. The format is
host1:port1,host2:port2. This is known as schema.registry.url in the Confluent
documentation. <p/> This option is only available in the Confluent Kafka
product (not standard Apache Kafka) | | String
+| *camel.component.kafka.configuration.schema-registry-u-r-l* | URL of the
Confluent schema registry servers to use. The format is
host1:port1,host2:port2. This is known as schema.registry.url in the Confluent
documentation. This option is only available in the Confluent Kafka product
(not standard Apache Kafka) | | String
| *camel.component.kafka.configuration.security-protocol* | Protocol used to
communicate with brokers. SASL_PLAINTEXT, PLAINTEXT and SSL are supported |
PLAINTEXT | String
| *camel.component.kafka.configuration.seek-to* | Set if KafkaConsumer will
read from beginning or end on startup: beginning : read from beginning end :
read from end This is replacing the earlier property seekToBeginning | | String
| *camel.component.kafka.configuration.send-buffer-bytes* | Socket write
buffer size | 131072 | Integer
| *camel.component.kafka.configuration.serializer-class* | The serializer
class for messages. | org.apache.kafka.common.serialization.StringSerializer |
String
| *camel.component.kafka.configuration.session-timeout-ms* | The timeout used
to detect failures when using Kafka's group management facilities. | 10000 |
Integer
+| *camel.component.kafka.configuration.specific-avro-reader* | This enables
the use of a specific Avro reader for use with the Confluent schema registry
and the io.confluent.kafka.serializers.KafkaAvroDeserializer. This option is
only available in the Confluent Kafka product (not standard Apache Kafka) |
false | Boolean
| *camel.component.kafka.configuration.ssl-cipher-suites* | A list of cipher
suites. This is a named combination of authentication, encryption, MAC and key
exchange algorithm used to negotiate the security settings for a network
connection using TLS or SSL network protocol. By default all the available
cipher suites are supported. | | String
| *camel.component.kafka.configuration.ssl-context-parameters* | SSL
configuration using a Camel {@link SSLContextParameters} object. If configured
it's applied before the other SSL endpoint parameters. | | SSLContextParameters
| *camel.component.kafka.configuration.ssl-enabled-protocols* | The list of
protocols enabled for SSL connections. TLSv1.2, TLSv1.1 and TLSv1 are enabled
by default. | TLSv1.2,TLSv1.1,TLSv1 | String