An update on this. It appears that the phenomenon I'm seeing is that disk
space is freed on restart, but it's not due to files getting deleted on
restart; instead, files are getting truncated on restart. It appears
that log files get pre-allocated to a larger size than is used right away.
Upon r
Hi all,
I have two questions regarding setting up MirrorMaker for our cross-cluster
replication (DC1 to DC2, for instance):
1. In what use case would you want to specify multiple consumer configs (see the invocation sketch below)?
2. It seems like the consumer inside the mirror is a SimpleConsumer. Is it
possible t
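For context, an invocation with more than one consumer config looks roughly like the sketch below (one config per source cluster; the .config file names are just placeholders):

bin/kafka-run-class.sh kafka.tools.MirrorMaker \
    --consumer.config sourceCluster1Consumer.config \
    --consumer.config sourceCluster2Consumer.config \
    --num.streams 2 \
    --producer.config targetClusterProducer.config \
    --whitelist=".*"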
Hey Jason,
As Jun says, we haven't seen that issue and no one else has reported it,
but it sounds like a bug of some kind.
In 0.7 we don't do any preallocation of anything. The only time files
shrink is during recovery: we re-checksum all messages that may not have
been flushed, and if any invali
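To make the recovery/truncation idea concrete, here is a simplified, self-contained Scala sketch (not the actual Kafka log code; it assumes a toy [4-byte length][4-byte CRC][payload] entry layout): re-checksum entries from the start and truncate the file at the first incomplete or corrupt one.

import java.io.RandomAccessFile
import java.util.zip.CRC32

// Simplified sketch, not the actual Kafka code. Toy log format:
// [4-byte length][4-byte CRC][payload] entries, back to back.
// Scan from the start, re-checksum each payload, and truncate the
// file at the first incomplete or corrupt entry.
object LogRecoverySketch {
  def recover(path: String): Unit = {
    val file = new RandomAccessFile(path, "rw")
    try {
      var validEnd = 0L
      var done = false
      while (!done && validEnd + 8 <= file.length()) {
        file.seek(validEnd)
        val length = file.readInt()
        val storedCrc = file.readInt()
        if (length < 0 || validEnd + 8 + length > file.length()) {
          done = true  // partially written entry at the tail
        } else {
          val payload = new Array[Byte](length)
          file.readFully(payload)
          val crc = new CRC32()
          crc.update(payload)
          if (crc.getValue.toInt != storedCrc) done = true  // checksum mismatch
          else validEnd = validEnd + 8 + length
        }
      }
      file.setLength(validEnd)  // drop everything past the last valid entry
    } finally {
      file.close()
    }
  }
}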
With SBT you can use 0.8.0-beta1 built with any of these four Scala versions
in libraryDependencies now:
"org.apache.kafka" % "kafka_2.9.2" % "0.8.0-beta1" intransitive()
or
"org.apache.kafka" % "kafka_2.9.1" % "0.8.0-beta1" intransitive()
or
"org.apache.kafka" % "kafka_2.8.2" % "0.8.0-beta1" in
What about the "newer" (not so new anymore) Scala version 2.10.0 and up?
When will it be supported officially?
Regards,
Dima Gutzeit.
On 15/7/13 9:32 AM, "Joe Stein" wrote:
>With SBT you can use 0.8.0-beta1 built with any of these four Scala
>versions
>in libraryDependencies now
>
>"org.apac
Excellent
What about the resolver? Sonatype, Typesafe, Maven artifactory?
Any plans for a 2.10+ compile as well?
On Sun, Jul 14, 2013 at 6:37 PM, Dima Gutzeit wrote:
> What about the "newer" (not so new anymore) scala version 2.10.0 and up ?
> When will it be supported officially ?
>
>
> Regar
SBT uses Maven Central as a default repository (the local Ivy cache too).
The artifacts are published to Maven Central, so there is nothing you should
have to do except specify the libraryDependencies.
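If you ever do want to add a resolver explicitly (for example Sonatype releases), it's a one-liner in build.sbt; just a sketch, the Kafka artifacts on Maven Central shouldn't require it:

resolvers += "Sonatype OSS Releases" at "https://oss.sonatype.org/content/repositories/releases"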
Regarding 2.10.x support, I took a really quick look at
https://issues.apache.org/jira/browse/KAFKA-717 (lots
Thanks for doing this!
I'm wondering whether there is a reason to prefer one version of Scala over
another, if we don't have any other particular Scala dependency in our
code. Are the newer versions better/more efficient somehow? We've
essentially been using 2.8.0 so far, which seems to be fine.
My company, and I am sure many other companies too, uses Kafka in
conjunction with other libraries/frameworks/platforms which may depend on
newer versions of Scala than 2.[8-9].
Scala 2.10 and up has been around for many months already, and an always
evolving/improving project such as Kafka should keep