Hello Kostas,

Here is the pom:

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements.  See the NOTICE file
distributed with this work for additional information
regarding copyright ownership.  The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License.  You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied.  See the License for the
specific language governing permissions and limitations
under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>

   <groupId>org.myorg.quickstart</groupId>
   <artifactId>quickstart</artifactId>
   <version>0.1</version>
   <packaging>jar</packaging>

   <name>Flink Quickstart Job</name>
   <url>http://www.myorganization.org</url>

   <properties>
      <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      <flink.version>1.0.3</flink.version>
   </properties>

   <repositories>
      <repository>
         <id>apache.snapshots</id>
         <name>Apache Development Snapshot Repository</name>
         <url>https://repository.apache.org/content/repositories/snapshots/</url>
         <releases>
            <enabled>false</enabled>
         </releases>
         <snapshots>
            <enabled>true</enabled>
         </snapshots>
      </repository>
   </repositories>

   <!--
      Execute "mvn clean package -Pbuild-jar" to build a jar file out of this project!

      How to use the Flink Quickstart pom:

      a) Adding new dependencies:
         You can add dependencies to the list below.
         Please check whether the maven-shade-plugin below is filtering out your
         dependency and, if so, remove the exclude from there.

      b) Building a jar for running on the cluster:
         There are two options for creating a jar from this project.

         b.1) "mvn clean package" -> this will create a fat jar which contains all
              dependencies necessary for running the jar created by this pom in a
              cluster. The "maven-shade-plugin" excludes everything that is provided
              on a running Flink cluster.

         b.2) "mvn clean package -Pbuild-jar" -> this will also create a fat jar,
              but with much nicer dependency exclusion handling. This approach is
              preferred and leads to much cleaner jar files.
   -->

   <dependencies>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-java</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.opencv</groupId>
         <artifactId>opencv</artifactId>
         <version>3.1.0</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-streaming-java_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-clients_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-yarn-tests</artifactId>
         <version>0.10-SNAPSHOT</version>
      </dependency>
   </dependencies>

   <profiles>
      <profile>
         <!-- Profile for packaging correct JAR files -->
         <id>build-jar</id>
         <activation>
            <activeByDefault>false</activeByDefault>
         </activation>
         <dependencies>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-java</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-streaming-java_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-clients_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
         </dependencies>

         <build>
            <plugins>
               <!-- disable the exclusion rules -->
               <plugin>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-shade-plugin</artifactId>
                  <version>2.4.1</version>
                  <executions>
                     <execution>
                        <phase>package</phase>
                        <goals>
                           <goal>shade</goal>
                        </goals>
                        <configuration>
                           <artifactSet>
                              <excludes combine.self="override"></excludes>
                           </artifactSet>
                        </configuration>
                     </execution>
                  </executions>
               </plugin>
            </plugins>
         </build>
      </profile>
   </profiles>

   <build>
      <plugins>
         <!-- We use the maven-shade plugin to create a fat jar that contains all
         dependencies except Flink and its transitive dependencies. The resulting
         fat jar can be executed on a cluster. Change the value of Program-Class
         if your program entry point changes. -->
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.1</version>
            <executions>
               <!-- Run shade goal on package phase -->
               <execution>
                  <phase>package</phase>
                  <goals>
                     <goal>shade</goal>
                  </goals>
                  <configuration>
                     <artifactSet>
                        <excludes>
                           <!-- This list contains all dependencies of flink-dist.
                           Everything else will be packaged into the fat jar. -->
                           <exclude>org.apache.flink:flink-annotations</exclude>
                           <exclude>org.apache.flink:flink-shaded-hadoop1</exclude>
                           <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
                           <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
                           <exclude>org.apache.flink:flink-core</exclude>
                           <exclude>org.apache.flink:flink-java</exclude>
                           <exclude>org.apache.flink:flink-scala_2.10</exclude>
                           <exclude>org.apache.flink:flink-runtime_2.10</exclude>
                           <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
                           <exclude>org.apache.flink:flink-clients_2.10</exclude>
                           <exclude>org.apache.flink:flink-avro_2.10</exclude>
                           <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
                           <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
                           <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>

                           <!-- Also exclude very big transitive dependencies of Flink.

                           WARNING: You have to remove these excludes if your code
                           relies on other versions of these dependencies. -->
                           <exclude>org.scala-lang:scala-library</exclude>
                           <exclude>org.scala-lang:scala-compiler</exclude>
                           <exclude>org.scala-lang:scala-reflect</exclude>
                           <exclude>com.amazonaws:aws-java-sdk</exclude>
                           <exclude>com.typesafe.akka:akka-actor_*</exclude>
                           <exclude>com.typesafe.akka:akka-remote_*</exclude>
                           <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
                           <exclude>io.netty:netty-all</exclude>
                           <exclude>io.netty:netty</exclude>
                           <exclude>commons-fileupload:commons-fileupload</exclude>
                           <exclude>org.apache.avro:avro</exclude>
                           <exclude>commons-collections:commons-collections</exclude>
                           <exclude>org.codehaus.jackson:jackson-core-asl</exclude>
                           <exclude>org.codehaus.jackson:jackson-mapper-asl</exclude>
                           <exclude>com.thoughtworks.paranamer:paranamer</exclude>
                           <exclude>org.xerial.snappy:snappy-java</exclude>
                           <exclude>org.apache.commons:commons-compress</exclude>
                           <exclude>org.tukaani:xz</exclude>
                           <exclude>com.esotericsoftware.kryo:kryo</exclude>
                           <exclude>com.esotericsoftware.minlog:minlog</exclude>
                           <exclude>org.objenesis:objenesis</exclude>
                           <exclude>com.twitter:chill_*</exclude>
                           <exclude>com.twitter:chill-java</exclude>
                           <exclude>com.twitter:chill-avro_*</exclude>
                           <exclude>com.twitter:chill-bijection_*</exclude>
                           <exclude>com.twitter:bijection-core_*</exclude>
                           <exclude>com.twitter:bijection-avro_*</exclude>
                           <exclude>commons-lang:commons-lang</exclude>
                           <exclude>junit:junit</exclude>
                           <exclude>de.javakaffee:kryo-serializers</exclude>
                           <exclude>joda-time:joda-time</exclude>
                           <exclude>org.apache.commons:commons-lang3</exclude>
                           <exclude>org.slf4j:slf4j-api</exclude>
                           <exclude>org.slf4j:slf4j-log4j12</exclude>
                           <exclude>log4j:log4j</exclude>
                           <exclude>org.apache.commons:commons-math</exclude>
                           <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
                           <exclude>commons-logging:commons-logging</exclude>
                           <exclude>commons-codec:commons-codec</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
                           <exclude>stax:stax-api</exclude>
                           <exclude>com.typesafe:config</exclude>
                           <exclude>org.uncommons.maths:uncommons-maths</exclude>
                           <exclude>com.github.scopt:scopt_*</exclude>
                           <exclude>commons-io:commons-io</exclude>
                           <exclude>commons-cli:commons-cli</exclude>
                        </excludes>
                     </artifactSet>
                     <filters>
                        <filter>
                           <artifact>org.apache.flink:*</artifact>
                           <excludes>
                              <!-- exclude shaded google but include shaded curator -->
                              <exclude>org/apache/flink/shaded/com/**</exclude>
                              <exclude>web-docs/**</exclude>
                           </excludes>
                        </filter>
                        <filter>
                           <!-- Do not copy the signatures in the META-INF folder.
                           Otherwise, this might cause SecurityExceptions when
                           using the JAR. -->
                           <artifact>*:*</artifact>
                           <excludes>
                              <exclude>META-INF/*.SF</exclude>
                              <exclude>META-INF/*.DSA</exclude>
                              <exclude>META-INF/*.RSA</exclude>
                           </excludes>
                        </filter>
                     </filters>
                     <transformers>
                        <!-- add Main-Class to manifest file -->
                        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                           <mainClass>org.myorg.quickstart.Job</mainClass>
                        </transformer>
                     </transformers>

                     <createDependencyReducedPom>false</createDependencyReducedPom>
                  </configuration>
               </execution>
            </executions>
         </plugin>

         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
               <source>1.7</source> <!-- If you want to use Java 8, change this to "1.8" -->
               <target>1.7</target> <!-- If you want to use Java 8, change this to "1.8" -->
            </configuration>
         </plugin>
      </plugins>


      <!-- If you want to use Java 8 Lambda Expressions, uncomment the following lines -->
      <!--
      <pluginManagement>
         <plugins>
            <plugin>
               <artifactId>maven-compiler-plugin</artifactId>
               <configuration>
                  <source>1.8</source>
                  <target>1.8</target>
                  <compilerId>jdt</compilerId>
               </configuration>
               <dependencies>
                  <dependency>
                     <groupId>org.eclipse.tycho</groupId>
                     <artifactId>tycho-compiler-jdt</artifactId>
                     <version>0.21.0</version>
                  </dependency>
               </dependencies>
            </plugin>

            <plugin>
               <groupId>org.eclipse.m2e</groupId>
               <artifactId>lifecycle-mapping</artifactId>
               <version>1.0.0</version>
               <configuration>
                  <lifecycleMappingMetadata>
                     <pluginExecutions>
                        <pluginExecution>
                           <pluginExecutionFilter>
                              <groupId>org.apache.maven.plugins</groupId>
                              <artifactId>maven-assembly-plugin</artifactId>
                              <versionRange>[2.4,)</versionRange>
                              <goals>
                                 <goal>single</goal>
                              </goals>
                           </pluginExecutionFilter>
                           <action>
                              <ignore/>
                           </action>
                        </pluginExecution>
                        <pluginExecution>
                           <pluginExecutionFilter>
                              <groupId>org.apache.maven.plugins</groupId>
                              <artifactId>maven-compiler-plugin</artifactId>
                              <versionRange>[3.1,)</versionRange>
                              <goals>
                                 <goal>testCompile</goal>
                                 <goal>compile</goal>
                              </goals>
                           </pluginExecutionFilter>
                           <action>
                              <ignore/>
                           </action>
                        </pluginExecution>
                     </pluginExecutions>
                  </lifecycleMappingMetadata>
               </configuration>
            </plugin>
         </plugins>
      </pluginManagement>
      -->

   </build>
</project>

and currently I am using Flink version 1.0.3. Until yesterday I had 1.0.2;
I removed that version this morning and downloaded 1.0.3.
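
One quick sanity check on the pom above, given the question about mixed versions quoted below: list every dependency version that is not tied to `${flink.version}`. The sketch works on a copied excerpt of the two relevant entries (file path and excerpt are illustrative, not from the original mail):

```shell
# Copy of two dependency entries from the pom above
# (excerpt and /tmp path are illustrative).
cat > /tmp/pom-excerpt.xml <<'EOF'
<dependency>
  <artifactId>flink-clients_2.10</artifactId>
  <version>${flink.version}</version>
</dependency>
<dependency>
  <artifactId>flink-yarn-tests</artifactId>
  <version>0.10-SNAPSHOT</version>
</dependency>
EOF
# Any hard-coded version that bypasses ${flink.version} can pull classes
# from a second Flink version onto the classpath:
grep -v 'flink.version' /tmp/pom-excerpt.xml | grep '<version>'
```

Run against the full pom.xml, the same filter flags the `flink-yarn-tests:0.10-SNAPSHOT` entry, which mixes a 0.10 snapshot with the 1.0.3 dependencies.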

Warm Regards,
Debaditya

On Thu, Jun 2, 2016 at 3:07 PM, Kostas Kloudas <k.klou...@data-artisans.com>
wrote:

> Could you please post your pom file and the version of Flink you
> downloaded?
> The latter you can find at the beginning of the log files.
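
For reference, the version line this refers to appears near the top of the JobManager log; a sketch against a simulated log line (the exact line format, Rev value, and file path are assumptions, not from the original mail):

```shell
# Simulated first line of a JobManager log (format and values assumed;
# check your own log/ directory for the real file).
cat > /tmp/jobmanager.log <<'EOF'
Starting JobManager (Version: 1.0.3, Rev:f3a6b5f, Date:06.05.2016)
EOF
# The running cluster's Flink version is printed near the top of the log:
grep -o 'Version: [0-9.]*' /tmp/jobmanager.log
```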
>
> Kostas
>
> On Jun 2, 2016, at 2:55 PM, Debaditya Roy <roydca...@gmail.com> wrote:
>
> Hi Kostas,
>
> I followed this
>
> https://ci.apache.org/projects/flink/flink-docs-release-1.0/quickstart/java_api_quickstart.html
> .
> Just for your information, in the IDE the program runs fine; it is only on
> the command line, when I try to submit it, that it throws the error.
>
> Warm Regards,
> Debaditya
>
> On Thu, Jun 2, 2016 at 2:17 PM, Kostas Kloudas <
> k.klou...@data-artisans.com> wrote:
>
>> Hi Debaditya,
>>
>> When creating your application, did you follow the steps described in:
>>
>> https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/local_execution.html
>> (of course adjusted to the release you are currently using)?
>>
>> Kostas
>>
>> On Jun 2, 2016, at 2:01 PM, Debaditya Roy <roydca...@gmail.com> wrote:
>>
>> Hello Kostas,
>>
>> Thanks for your reply.
>>
>> I don't think that is the case, because I am running the application on
>> the JVM of my local machine, not on a distributed cluster. Any other input?
>>
>> Warm Regards,
>> Debaditya
>>
>>
>>
>> On Thu, Jun 2, 2016 at 1:27 PM, Kostas Kloudas <
>> k.klou...@data-artisans.com> wrote:
>>
>>> Hello Debaditya,
>>>
>>> From the exception message you posted it seems that it is a linkage
>>> error.
>>>
>>> Could it be that you are combining different versions of Flink when
>>> running your application?
>>> E.g. you have version X running on your cluster and you create your jar
>>> against version Y on your local machine?
>>> In this case, Flink version Y may have a method that is no longer there
>>> in version X, and this causes the
>>> exception.
>>>
>>> Kostas
>>>
>>> On Jun 2, 2016, at 12:02 PM, Debaditya Roy <roydca...@gmail.com> wrote:
>>>
>>> Hi,
>>>
>>> I am trying to run a simple Flink program which reads an image from
>>> disk, does some image processing, and stores it back to disk. For this
>>> purpose I have a custom-defined class, which I am using in a DataSet and
>>> passing on to a flatMap function. However, the program encountered an
>>> error which I am clueless about. Any help will be highly appreciated.
>>> Pasting the error below.
>>>
>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>> org.apache.flink.api.java.typeutils.runtime.kryo.Serializers.recursivelyRegisterType(Ljava/lang/Class;Lorg/apache/flink/api/common/ExecutionConfig;)V
>>>     at org.apache.flink.api.java.ExecutionEnvironment$1.preVisit(ExecutionEnvironment.java:985)
>>>     at org.apache.flink.api.java.ExecutionEnvironment$1.preVisit(ExecutionEnvironment.java:977)
>>>     at org.apache.flink.api.common.operators.SingleInputOperator.accept(SingleInputOperator.java:198)
>>>     at org.apache.flink.api.common.operators.GenericDataSinkBase.accept(GenericDataSinkBase.java:223)
>>>     at org.apache.flink.api.common.Plan.accept(Plan.java:348)
>>>     at org.apache.flink.api.java.ExecutionEnvironment.createProgramPlan(ExecutionEnvironment.java:977)
>>>     at org.apache.flink.api.java.ExecutionEnvironment.createProgramPlan(ExecutionEnvironment.java:938)
>>>     at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:80)
>>>     at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:834)
>>>     at org.apache.flink.api.java.DataSet.collect(DataSet.java:408)
>>>     at org.myorg.quickstart.Job.main(Job.java:51)
>>>
>>> Warm Regards,
>>> Debaditya
>>>
>>>
>>>
>>
>>
>
>
