Hi Mich,

You can add external dependencies to the Scala shell using the `--addclasspath` option. There is a more detailed description in the documentation [1].
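As a sketch, starting the shell might look like the following. The `FLINK_HOME` variable and the jar path are assumptions taken from the paths mentioned later in this thread; adjust them to your installation.

```shell
# Hypothetical paths: set FLINK_HOME and the connector jar location for your setup.
# Start the Flink Scala shell in local mode with the Kafka connector on the classpath.
$FLINK_HOME/bin/start-scala-shell.sh local \
  --addclasspath /home/hduser/jars/flink-connector-kafka-0.10.1.jar
```

Once the shell starts with the jar on its classpath, the `org.apache.flink.streaming.connectors.kafka` import should resolve.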
[1]: https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/scala_shell.html#adding-external-dependencies

Regards,
Chiwan Park

> On Apr 17, 2016, at 6:04 PM, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
>
> Hi,
>
> In Spark shell I can load the Kafka jar file through the spark-shell option --jars:
>
> spark-shell --master spark://50.140.197.217:7077 --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar
>
> This works fine.
>
> In Flink I have added the jar file /home/hduser/jars/flink-connector-kafka-0.10.1.jar to the CLASSPATH.
>
> However I don't get any support for it within the Flink shell:
>
> Scala-Flink> import org.apache.flink.streaming.connectors.kafka
> <console>:54: error: object connectors is not a member of package org.apache.flink.streaming
>        import org.apache.flink.streaming.connectors.kafka
>                                          ^
>
> Any ideas will be appreciated.
>
> Dr Mich Talebzadeh
>
> LinkedIn https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
> http://talebzadehmich.wordpress.com