Hi Hanan,
Could you please try running your program on a local cluster with
your dependencies in the lib folder? You can use "./bin/start-local.sh" and
try submitting your program to localhost. That would help us find out
whether it is a YARN issue.
Thanks,
Max
On Fri, Sep 25, 2015 at 8:39
Hi
I rechecked that I put all my jars in the lib folder.
I have also noticed that it fails while loading my first POJO class.
I start the cluster via YARN using Flink 0.9.1.
Thanks,
Mr Hanan Meyer
On Thu, Sep 24, 2015 at 6:08 PM, Stephan Ewen wrote:
My first guess would be that you did not put all jars into the lib folder.
To help us understand this, do you start the cluster manually, or via YARN?
On Thu, Sep 24, 2015 at 4:59 PM, Hanan Meyer wrote:
Hi
Thanks for the fast response
I have tried the workaround of excluding the jars from the
RemoteEnvironment's init line:
ExecutionEnvironment env =
ExecutionEnvironment.createRemoteEnvironment(FLINK_URL, FLINK_PORT);
instead of:
ExecutionEnvironment env =
ExecutionEnvironment.createRemoteEnvironment(FLINK_URL, FLINK_PORT, ...);
Hi Hanan,
you're right that currently, every time you submit a job to the Flink
cluster, all user code jars are uploaded and overwrite any existing
files. This is not really necessary if they haven't changed. Maybe we should
add a check so that files already existing on the JobManager are not uploaded again.
I think there is not yet any mechanism, but it would be a good addition, I
agree.
Between the JobManager and the TaskManagers, the JARs are cached. The TaskManagers
receive only the hashes of the JARs, and only fetch them if they do not already
have them. The same mechanism should be used for the Client to upload the JARs
to the JobManager.
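The hash-based caching described above can be sketched roughly like this. This is a minimal, self-contained illustration using only the JDK's MessageDigest; the class and method names (BlobCacheSketch, uploadIfMissing) are hypothetical, not Flink's actual blob-cache API:

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of content-addressed JAR caching: the receiving side
// keys stored blobs by their SHA-256 hash, so a sender only has to transfer
// bytes the receiver has not seen before.
public class BlobCacheSketch {

    private final Map<String, byte[]> store = new HashMap<>();

    /** Hex-encoded SHA-256 digest of the given bytes. */
    static String sha256Hex(byte[] data) throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest(data)) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    /** Returns true if a blob with this hash is already cached. */
    public boolean contains(String hash) {
        return store.containsKey(hash);
    }

    /** Stores the blob only when its hash is unknown; returns whether bytes were transferred. */
    public boolean uploadIfMissing(byte[] jarBytes) throws NoSuchAlgorithmException {
        String hash = sha256Hex(jarBytes);
        if (store.containsKey(hash)) {
            return false; // cache hit: skip the upload
        }
        store.put(hash, jarBytes);
        return true;      // cache miss: bytes transferred
    }

    public static void main(String[] args) throws Exception {
        BlobCacheSketch cache = new BlobCacheSketch();
        byte[] jar = "fake jar contents".getBytes();
        System.out.println(cache.uploadIfMissing(jar)); // true: first upload
        System.out.println(cache.uploadIfMissing(jar)); // false: already cached
    }
}
```

With this scheme, resubmitting an unchanged job would only exchange hashes, not JAR contents.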
Hello all,
I use Flink to filter data from HDFS and write it back as CSV.
I keep getting the "Checking and uploading JAR files" message on every DataSet
filtering action or
executionEnvironment execution.
I use ExecutionEnvironment.createRemoteEnvironment(ip+jars..) because I
launch Flink from
a