Ah, yes. Scala interprets Java collection types, like ArrayList[T], as
*invariant*, which means that you can't use a declared ArrayList[String]
where you expect an ArrayList[Any], which would be an example of
*covariance*. (This is due to Java's flawed way of declaring generics,
where the person *declaring* the collection doesn't have the ability to
specify variance behavior. The *user* decides if a declaration is
invariant, covariant, or contravariant.)

Example using the Scala interpreter:

scala> import java.util.ArrayList
import java.util.ArrayList

scala> var ala: ArrayList[Any] = null             // var so we can attempt assignments.
ala: java.util.ArrayList[Any] = null

scala> ala = new ArrayList[String]()              // rejected. ala must be assigned an ArrayList[Any]
<console>:12: error: type mismatch;
 found   : java.util.ArrayList[String]
 required: java.util.ArrayList[Any]
Note: String <: Any, but Java-defined class ArrayList is invariant in type E.
You may wish to investigate a wildcard type such as `_ <: Any`. (SLS 3.2.10)
       ala = new ArrayList[String]()
             ^

scala> val als = new ArrayList[String]()          // it doesn't work to declare an ArrayList[String] then try to assign to ala
als: java.util.ArrayList[String] = []

scala> ala = als
<console>:13: error: type mismatch;
 found   : java.util.ArrayList[String]
 required: java.util.ArrayList[Any]
Note: String <: Any, but Java-defined class ArrayList is invariant in type E.
You may wish to investigate a wildcard type such as `_ <: Any`. (SLS 3.2.10)
       ala = als
             ^

scala> ala = new ArrayList[Any]()                 // can only assign an ArrayList[Any] instance.
ala: java.util.ArrayList[Any] = []
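As the compiler hint suggests, a use-site wildcard (an existential type) is one way to hold an ArrayList of any element type, at the cost of losing the precise element type when you read from it. A minimal sketch (not from the session above):

```scala
import java.util.ArrayList

// A wildcard reference can point at an ArrayList of any element type.
val als = new ArrayList[String]()
als.add("hi")
val alw: ArrayList[_ <: Any] = als   // accepted, unlike ArrayList[Any]

// Reading back, the element type is only known to be Any.
val first: Any = alw.get(0)
```

The trade-off: through `alw` you can no longer safely add elements, because the compiler doesn't know what the element type actually is.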

// *************** BUT YOU CAN ASSIGN STRINGS, INTS, ETC. TO THE ARRAY ITSELF ********************

scala> ala.add(1)
res3: Boolean = true

scala> ala.add("two")
res4: Boolean = true

scala> ala.add(3.3)
res5: Boolean = true

scala> ala
res6: java.util.ArrayList[Any] = [1, two, 3.3]

Note that the type of ala hasn't changed.
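The invariance isn't arbitrary: if the compiler let you treat an ArrayList[String] as an ArrayList[Any], adding a non-String through the Any view would corrupt the String view. A sketch of the failure, forcing the issue with an unchecked cast (never do this in real code):

```scala
import java.util.ArrayList

val als = new ArrayList[String]()
val ala = als.asInstanceOf[ArrayList[Any]]  // bypass the type checker
ala.add(42)                                 // legal for ArrayList[Any]

// als now contains an Int; reading it back as a String fails at
// runtime with a ClassCastException -- exactly what invariance
// prevents at compile time.
val blewUp =
  try { val s: String = als.get(0); false }
  catch { case _: ClassCastException => true }
```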

Contrast with Scala collections like Seq (List), which is covariant:

scala> var sa: Seq[Any] = null
sa: Seq[Any] = null

scala> sa = List.empty[String]
sa: Seq[Any] = List()

scala> sa = List(1, "two", 3.3)
sa: Seq[Any] = List(1, two, 3.3)

scala> val list = List(1, "two", 3.3)
list: List[Any] = List(1, two, 3.3)

scala> sa = list
sa: Seq[Any] = List(1, two, 3.3)

Note that the static type of "sa" doesn't change, even when it actually
points to an instance of List, a covariant subtype of Seq.
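Declaration-site variance is what makes this work: Seq is declared Seq[+A]. The same mechanism is available for your own types; a minimal sketch (Box is a made-up class, not a standard one):

```scala
// Covariant: T appears only in "output" (return) position.
class Box[+T](val value: T)

val bs: Box[String] = new Box("hello")
val ba: Box[Any]    = bs      // compiles: Box[String] <: Box[Any]

// A mutable field would put T in input position, and the compiler
// rejects it. Uncommenting this line is a compile error:
// class MutableBox[+T](var value: T)
```

Note the compiler itself polices the annotation: covariance is only allowed where it's sound, which is why immutable collections like List can be covariant but mutable ones can't.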


For completeness, there's also *contravariance*; if Foo[T] is
contravariant, then a Foo[String] would be a supertype of Foo[Any]. This
is less common and harder to understand, but an important example is the
argument types of the function types, like Function2[-Arg1, -Arg2,
+Return], e.g.,

scala> var fisa: Function2[Int,String,Any] = null    // a function that takes Int and String args and returns an Any
fisa: (Int, String) => Any = null

// A function that takes two Any arguments (note that Any is a supertype of both Int and String), and returns a String

scala> val faas: Function2[Any,Any,String] = (a1:Any, a2:Any) => a1.toString + a2.toString
faas: (Any, Any) => String = <function2>

scala> fisa = faas                                   // MIND BLOWN!!
fisa: (Int, String) => Any = <function2>

scala> fisa(1, "two")
res7: Any = 1two

Why does this work? The Liskov Substitution Principle is really the
correct, technical definition of what we've called the "is a"
relationship in OOP. fisa defines a contract: whatever function you
actually use here must be able to accept an Int and a String, and it
must guarantee to return an Any (that's easy). faas satisfies this
contract. Not only can it accept an Int and a String, it can accept any
two Anys. That is, it is more tolerant of what you pass it. It returns
only a String, but that's okay, because the contract only requires that
it return an Any.
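The same reasoning holds for the simpler Function1, whose single argument type is contravariant; a minimal sketch:

```scala
// Function1[-A, +R]: a handler of Any can stand in for a handler of
// String, because anything that accepts Any certainly accepts a String.
val anyHandler: Any => Int       = a => a.toString.length
val stringHandler: String => Int = anyHandler   // compiles via contravariance

val n = stringHandler("four")
```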

This is probably waaaay more information than you wanted to hear ;) I
cover this in great depth in Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly), because so
few people, even with years of Java or other OO experience, understand
this supposedly-core OO principle.

Back to your problem. The upshot is that you can't use the more careful
typing rules you tried. You're stuck with the declarations that can't
enforce the same behavior you want. However, I would do the following:

// trait Protocol defines a protocol for creating stuff of type K
// No + or -, so Protocol is invariant in K
trait Protocol[K] {
  def createStuff(): K
}

import java.util.ArrayList

// KCALB explicitly returns ArrayList[Byte]
object KCALB extends Protocol[ArrayList[Byte]] {
  def createStuff(): ArrayList[Byte] = new ArrayList[Byte]()
}

// KCLL explicitly returns List[Long]
object KCLL extends Protocol[List[Long]] {
  def createStuff(): List[Long] = List.empty[Long]
}

val kcalb = KCALB.createStuff()    // kcalb: java.util.ArrayList[Byte] = []
val kcll  = KCLL.createStuff()     // kcll:  List[Long] = List()

KCALB and KCLL are not interchangeable in any way, but at least each object
constrains the "stuff" returned.
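If the Scala side owned the declaration, a variance annotation would let the subtyping flow through, because K appears only in return position. A sketch (CoProtocol is hypothetical, not part of your code):

```scala
// Covariant variant of Protocol: legal because K is only "produced",
// never consumed as an argument.
trait CoProtocol[+K] {
  def createStuff(): K
}

object StringsProtocol extends CoProtocol[List[String]] {
  def createStuff(): List[String] = List("a", "b")
}

// Thanks to +K, a producer of List[String] is also a producer of
// List[Any], so this assignment compiles.
val p: CoProtocol[List[Any]] = StringsProtocol
val stuff: List[Any] = p.createStuff()
```

That option isn't available when the generic type (like KafkaConsumer) is defined in Java, which is why the explicit Protocol[K] objects above are the pragmatic route.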

Actually, I would use "type members" instead, if I can't use variance
annotations:


trait Protocol2 {
  type Stuff
  def createStuff(): Stuff
}

object KCALB2 extends Protocol2 {
  type Stuff = ArrayList[Byte]
  def createStuff(): ArrayList[Byte] = new ArrayList[Byte]()
}

object KCLL2 extends Protocol2 {
  type Stuff = List[Long]
  def createStuff(): List[Long] = List.empty[Long]
}

val kcalb2 = KCALB2.createStuff()    // kcalb2: java.util.ArrayList[Byte] = []
val kcll2  = KCLL2.createStuff()     // kcll2:  List[Long] = List()
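One nice property of the type-member version: generic code can accept any Protocol2 and still return the precise Stuff type, via a path-dependent type. A sketch (the make helper is mine, not part of the code above):

```scala
trait Protocol2 {
  type Stuff
  def createStuff(): Stuff
}

object Longs extends Protocol2 {
  type Stuff = List[Long]
  def createStuff(): List[Long] = List(1L, 2L)
}

// The result type p.Stuff depends on which instance is passed in,
// so each caller gets back the precise type, not an opaque Stuff.
def make(p: Protocol2): p.Stuff = p.createStuff()

val ls: List[Long] = make(Longs)
```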


HTH,
Dean

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Lightbend <http://lightbend.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Thu, Sep 8, 2016 at 5:28 PM, Martin Gainty <mgai...@hotmail.com> wrote:

> yes that function compiled but everytime I implement createNewConsumer()
> scala compiler wants to downcast [java.util.ArrayList[Byte]],
> java.util.ArrayList[Byte]] to wildcard ?
>
>       consumer = createNewConsumer() //getConsumer()
>
> [ERROR]  found   : org.apache.kafka.clients.consumer.KafkaConsumer[java.
> util.ArrayList[Byte],java.util.ArrayList[Byte]]
>
> [ERROR]  required: org.apache.kafka.clients.consumer.KafkaConsumer[K,V]
>
> [ERROR] Note: java.util.ArrayList[Byte] >: K, but Java-defined class
> KafkaConsumer is invariant in type K.
>
> [ERROR] You may wish to investigate a wildcard type such as `_ >: K`. (SLS
> 3.2.10)
>
> [ERROR] Note: java.util.ArrayList[Byte] >: V, but Java-defined class
> KafkaConsumer is invariant in type V.
>
> [ERROR] You may wish to investigate a wildcard type such as `_ >: V`. (SLS
> 3.2.10)
>
> can scala actually handle this implementation or should I convert to to
> normal Java and avoid the headache?
>
> Martin
> ______________________________________________
>
>
>
>
> ------------------------------
> From: mgai...@hotmail.com
> To: mathieu.fenn...@replicon.com; users@kafka.apache.org
> Subject: RE: handling generics in Kafka Scala
> Date: Tue, 30 Aug 2016 23:00:29 -0400
>
> noob with Scala so Im looking for an experienced answer
>
> ConsumerGroupCommand.scala
>
> //private def createNewConsumer(): KafkaConsumer[String, String] = {
> //private def createNewConsumer(): KafkaConsumer[K extends
> java.util.ArrayList[Byte],V extends java.util.ArrayList[Byte]] = {
>     private def createNewConsumer(): KafkaConsumer[K <:
> java.util.ArrayList[Byte],V <: java.util.ArrayList[Byte]] = {
>       val properties = new java.util.Properties()
>       val deserializer = (new StringDeserializer).getClass.getName
>       val brokerUrl = opts.options.valueOf(opts.bootstrapServerOpt)
>       properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, brokerUrl)
>       properties.put(ConsumerConfig.GROUP_ID_CONFIG,
> opts.options.valueOf(opts.groupOpt))
>       properties.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false")
>       properties.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000")
>       properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
> deserializer)
>       properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
> deserializer)
>       if (opts.options.has(opts.commandConfigOpt))
> properties.putAll(Utils.loadProps(opts.options.
> valueOf(opts.commandConfigOpt)))
>       new KafkaConsumer(properties).asInstanceOf[KafkaConsumer[K,V]]
>     }
>
> scala-compiler displays:
> [ERROR] \kafka\kafka-trunk\core\src\main\scala\kafka\admin\
> ConsumerGroupCommand.scala:309: error: ']' expected but '<:' found.
> [ERROR]     private def createNewConsumer(): KafkaConsumer[? <:
> java.util.ArrayList[Byte],? <: java.util.ArrayList[Byte]] = {
> [ERROR]                                                      ^
> [ERROR] \kafka\kafka-trunk\core\src\main\scala\kafka\admin\
> ConsumerGroupCommand.scala:309: error: '=' expected but ',' found.
> [ERROR]     private def createNewConsumer(): KafkaConsumer[? <:
> java.util.ArrayList[Byte],? <: java.util.ArrayList[Byte]] = {
> [ERROR]
>                ^
> [ERROR] \kafka\kafka-trunk\core\src\main\scala\kafka\admin\
> ConsumerGroupCommand.scala:322: error: illegal start of simple expression
>
> i want 2 datatype parameter types extending java.util.ArrayList<Byte> in
> regular java this would be:
>
> public KafkaConsumer<K extends java.util.ArrayList<Byte>,V extends
> java.util.ArrayList<Byte>>  createNewConsumer() {
> }
>
> how do I setup a function or class declaration K, V whose parameter
> datatype extends java.util.ArrayList<Byte> ?
>
> Martin
> ______________________________________________
>
>
>
>
> > From: mathieu.fenn...@replicon.com
> > Date: Wed, 17 Aug 2016 18:06:38 -0600
> > Subject: Re: DLL Hell
> > To: mgai...@hotmail.com
> >
> > Hi Martin,
> >
> > I'm sorry, this is way outside my Kafka knowledge. I'm just a new
> > Kafka user who wanted to help with your Windows questions because I
> > had just faced the same hurdle. :-) Wish I could help, but I wouldn't
> > know where to start with this.
> >
> > Mathieu
> >
> >
> > On Wed, Aug 17, 2016 at 6:00 PM, Martin Gainty <mgai...@hotmail.com>
> wrote:
> > > Hi Matthieu
> > > Many Thanks for attaching the binary
> > >
> > > running scala->java generator plugin I see:
> > >
> > > [ERROR]
> > > C:\Maven-plugin\kafka\kafka-trunk\core\src\main\scala\
> kafka\admin\AdminUtils.scala:639:
> > > error: type PartitionMetadata is not a member of object
> > > org.apache.kafka.common.requests.MetadataResponse
> > >
> > > yet when I look at org.apache.kafka.common.requests.MetadataResponse.java
> I
> > > see inner class
> > >
> > > public static class PartitionMetadata {
> > >
> > > inner static java classes are not visible to the converter for some
> reason
> > > the workaround seems to be birth inner static classes (e.g.
> > > PartitionMetadata)
> > > treating inner class as standalone works
> > >
> > > Advice?
> > > Martin
> > > ______________________________________________
> > >
> > >
> > >
> > >
> > > ________________________________
> > > From: mathieu.fenn...@replicon.com
> > > Date: Tue, 16 Aug 2016 08:04:52 -0600
> > > Subject: Re: DLL Hell
> > > To: mgai...@hotmail.com
> > >
> > >
> > > Hey Martin,
> > >
> > > Attached is the native .dll that I was able to build for rocksdb. If
> you
> > > unzip this, and include the contained .dll into your
> rocksdbjni-4.8.0.jar at
> > > the root, it should be possible to use Kafka Streams in Windows. But
> this
> > > is just a minimal debug build; wouldn't be appropriate for production
> use.
> > > Might save you some time if you're just trying to get a dev environment
> > > working though.
> > >
> > > Mathieu
> > >
> > >
> > > On Tue, Aug 16, 2016 at 7:40 AM, Martin Gainty <mgai...@hotmail.com>
> wrote:
> > >
> > >
> > >
> > >
> > >> From: mathieu.fenn...@replicon.com
> > >> Date: Tue, 16 Aug 2016 06:57:16 -0600
> > >> Subject: Re: DLL Hell
> > >> To: users@kafka.apache.org
> > >>
> > >> Hey Martin,
> > >>
> > >> I had to modify the -G argument to that command to include the visual
> > >> studio year. If you run "cmake /?", it will output all the available
> > >> generators. My cmake looked like:
> > >>
> > >> cmake -G "Visual Studio 12 2013 Win64" -DJNI=1 ..
> > >>
> > >> I think this is probably a change in cmake since the rocksdb doc was
> > >> written (
> > >>
> > >> https://cmake.org/cmake/help/v3.0/generator/Visual%
> 20Studio%2012%202013.html
> > >> ).
> > >> MG>same "informative error"
> > >>C:\cygwin64\bin\cmake -G "Visual Studio 12 2013 Win64" -DJNI=1
> > > CMake Error: Could not create named generator Visual Studio 12 2013
> Win64
> > > Generators Unix Makefiles = Generates standard UNIX
> > > makefiles. Ninja = Generates build.ninja files.
> > > CodeBlocks - Ninja = Generates CodeBlocks project files.
> > > CodeBlocks - Unix Makefiles = Generates CodeBlocks project files.
> CodeLite
> > > - Ninja = Generates CodeLite project files. CodeLite - Unix
> > > Makefiles = Generates CodeLite project files. Eclipse CDT4 - Ninja
> > > = Generates Eclipse CDT 4.0 project files. Eclipse CDT4 - Unix
> Makefiles=
> > > Generates Eclipse CDT 4.0 project files. KDevelop3 =
> > > Generates KDevelop 3 project files. KDevelop3 - Unix Makefiles =
> > > Generates KDevelop 3 project files. Kate - Ninja =
> > > Generates Kate project files. Kate - Unix Makefiles = Generates Kate
> > > project files. Sublime Text 2 - Ninja = Generates Sublime Text 2
> > > project files. Sublime Text 2 - Unix Makefiles
> > > = Generates Sublime Text 2 project files.
> > > MG>I am thinking if I want to automate this native build..I could more
> > > easily create binary thru maven-nar-plugin ?
> > > MG>as I do not have any MS VS or DotNet installed..maybe I need to
> install
> > > many gigs of MS specific VS?
> > > MG>Please advise
> > >> Mathieu
> > >>
> > >>
> > >> On Tue, Aug 16, 2016 at 5:03 AM, Martin Gainty <mgai...@hotmail.com>
> > >> wrote:
> > >>
> > >> > havent used cmake in over 10 years so Im a bit lost..
> > >> > cmake -G "Visual Studio 12 Win64" -DGFLAGS=1 -DSNAPPY=1 -DJEMALLOC=1
> > >> > -DJNI=1
> > >> > CMake Error: Could not create named generator Visual Studio 12 Win64
> > >> > ?Please advise
> > >> > Martin
> > >> > ______________________________________________
> > >> >
> > >> >
> > >> >
> > >> > > From: mathieu.fenn...@replicon.com
> > >> > > Date: Mon, 15 Aug 2016 13:43:47 -0600
> > >> > > Subject: Re: DLL Hell
> > >> > > To: users@kafka.apache.org
> > >> > >
> > >> > > Hi Martin,
> > >> > >
> > >> > > rocksdb does not currently distribute a Windows-compatible build
> of
> > >> > > their
> > >> > > rocksdbjni library. I recently wrote up some instructions on how
> to
> > >> > > produce a local build, which you can find here:
> > >> > > http://mail-archives.apache.org/mod_mbox/kafka-users/
> > >> > 201608.mbox/%3CCAHoiPjweo-xSj3TiodcDVf4wNnnJ8u6PcwWDPF7L
> > >> > T5ps%2BxQ3eA%40mail.gmail.com%3E
> > >> > >
> > >> > > I'd also suggest tracking this issue in GitHub, which is likely
> to be
> > >> > > updated if this ever changes: https://github.com/facebook/
> > >> > rocksdb/issues/703
> > >> > >
> > >> > > Mathieu
> > >> > >
> > >> > >
> > >> > > On Mon, Aug 15, 2016 at 1:34 PM, Martin Gainty <
> mgai...@hotmail.com>
> > >> > wrote:
> > >> > >
> > >> > > > kafka-trunk\streams>gradle buildCaused by:
> > >> > > > java.lang.RuntimeException:
> > >> > > > librocksdbjni-win64.dll was not found inside JAR. at
> > >> > org.rocksdb.
> > >> > > > NativeLibraryLoader.loadLibraryFromJarToTemp(
> > >> > NativeLibraryLoader.java:106)
> > >> > > > at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(
> > >> > NativeLibraryLoader.java:78)
> > >> > > > at org.rocksdb.NativeLibraryLoader.loadLibrary(
> > >> > NativeLibraryLoader.java:56)
> > >> > > > at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:47) at
> > >> > > > org.rocksdb.RocksDB.<clinit>(RocksDB.java:23)
> > >> > > > any idea where I can locale librocksdbjni-win64.dll ?
> > >> > > > /thanks/
> > >> > > > Martin
> > >> > > > ______________________________________________
> > >> > > >
> > >> > > >
> > >> >
> > >> >
> > >
> > >
> > >
>
