Yep, that was it (deploying 1.8.1 over 1.8.0). Thanks!
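For anyone landing here with the same NoSuchMethodError: ClosureCleaner.clean changed its signature between 1.8.0 and 1.8.1 (see FLINK-13586 below), so the Kafka connector jar bundled in the job must come from the same release as the flink-dist on the cluster. A minimal sbt sketch of pinning them together — the artifact names are the 1.8.x ones, and the exact module list is an illustrative assumption, not the poster's actual build:

```scala
// build.sbt (sketch): derive every Flink artifact from one version value,
// so the connector can never drift from the flink-dist it runs on.
val flinkVersion = "1.8.1" // must match the version deployed on the cluster

libraryDependencies ++= Seq(
  // supplied by the cluster's lib/ directory at runtime, so marked Provided
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % Provided,
  // bundled into the job jar; same version as flink-dist avoids the
  // NoSuchMethodError on ClosureCleaner.clean seen in the trace below
  "org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion
)
```

The cluster side can be checked from the dist jar name, e.g. `lib/flink-dist_2.11-1.8.1.jar`.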

On Mon, Aug 5, 2019 at 5:53 PM Gaël Renoux <gael.ren...@datadome.co> wrote:

> Hi Avi and Victor,
>
> I just opened this ticket on JIRA:
> https://issues.apache.org/jira/browse/FLINK-13586
> (I hadn't seen these e-mails). Backward compatibility is broken between
> 1.8.0 and 1.8.1 if you use Kafka connectors.
>
> Can you upgrade your flink-connector-kafka dependency to 1.8.1? Note that the
> job won't deploy on a 1.8.0 server anymore, if that's a concern for you.
>
> Gaël
>
> On Mon, Aug 5, 2019 at 4:37 PM Wong Victor <jiasheng.w...@outlook.com>
> wrote:
>
>> Hi Avi:
>>
>>
>>
>> It seems you are submitting your job with an older Flink version (< 1.8),
>> please check your flink-dist version.
>>
>>
>>
>> Regards,
>>
>> Victor
>>
>>
>>
>> *From: *Avi Levi <avi.l...@bluevoyant.com>
>> *Date: *Monday, August 5, 2019 at 9:11 PM
>> *To: *user <user@flink.apache.org>
>> *Subject: *getting an exception
>>
>>
>>
>> Hi,
>>
>> I'm using Flink 1.8.1. Our code is mostly Scala.
>>
>> When I try to submit my job (on my local machine) it crashes with the
>> error below (BTW, in the IDE it runs perfectly).
>>
>> Any assistance would be appreciated.
>>
>> Thanks
>>
>> Avi
>>
>> 2019-08-05 12:58:03.783 [Flink-DispatcherRestEndpoint-thread-3] ERROR 
>> org.apache.flink.runtime.webmonitor.handlers.JarRunHandler  - Unhandled 
>> exception.
>>
>> org.apache.flink.client.program.ProgramInvocationException: The program 
>> caused an error:
>>
>>         at 
>> org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:93)
>>
>>         at 
>> org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:80)
>>
>>         at 
>> org.apache.flink.runtime.webmonitor.handlers.utils.JarHandlerUtils$JarHandlerContext.toJobGraph(JarHandlerUtils.java:126)
>>
>>         at 
>> org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$getJobGraphAsync$6(JarRunHandler.java:142)
>>
>>         at 
>> java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
>>
>>         at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>
>>         at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>
>>         at java.lang.Thread.run(Thread.java:748)
>>
>> Caused by: java.lang.NoSuchMethodError: 
>> org.apache.flink.api.java.ClosureCleaner.clean(Ljava/lang/Object;Lorg/apache/flink/api/common/ExecutionConfig$ClosureCleanerLevel;Z)V
>>
>>         at 
>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.<init>(FlinkKafkaProducer011.java:494)
>>
>>         at 
>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.<init>(FlinkKafkaProducer011.java:448)
>>
>>         at 
>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.<init>(FlinkKafkaProducer011.java:383)
>>
>>         at 
>> com.bluevoyant.commons.queueHandlers.KeyedKafkaProducerImpl.producer(KafkaImpl.scala:18)
>>
>>         at 
>> com.bluevoyant.commons.queueHandlers.KeyedKafkaProducerImpl.producer$(KafkaImpl.scala:18)
>>
>>         at 
>> com.bluevoyant.lookalike.analytic.queueHandlers.QueueHandlerImpl$.producer(QueueHandlerImpl.scala:13)
>>
>>         at 
>> com.bluevoyant.lookalike.analytic.StreamingJob$.delayedEndpoint$com$bluevoyant$lookalike$analytic$StreamingJob$1(StreamingJob.scala:42)
>>
>>         at 
>> com.bluevoyant.lookalike.analytic.StreamingJob$delayedInit$body.apply(StreamingJob.scala:14)
>>
>>         at scala.Function0.apply$mcV$sp(Function0.scala:34)
>>
>>         at scala.Function0.apply$mcV$sp$(Function0.scala:34)
>>
>>         at 
>> scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
>>
>>         at scala.App.$anonfun$main$1$adapted(App.scala:76)
>>
>>         at scala.collection.immutable.List.foreach(List.scala:388)
>>
>>         at scala.App.main(App.scala:76)
>>
>>         at scala.App.main$(App.scala:74)
>>
>>         at 
>> com.bluevoyant.lookalike.analytic.StreamingJob$.main(StreamingJob.scala:14)
>>
>>         at 
>> com.bluevoyant.lookalike.analytic.StreamingJob.main(StreamingJob.scala)
>>
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>>         at 
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>
>>         at 
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>
>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>
>>         at 
>> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
>>
>>         at 
>> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
>>
>>         at 
>> org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:83)
>>
>>         ... 7 common frames omitted
>>
>>
>>
>
>
> --
> Gaël Renoux
> Senior R&D Engineer, DataDome
> M +33 6 76 89 16 52
> E gael.ren...@datadome.co
> W www.datadome.co
>
