Thank you for your quick response.
I just do not quite understand what you mean. Do I need to define a method in
Java, or do you mean that I should use the monitoring in the browser?
2015-06-26 10:09 GMT+02:00 Robert Metzger:
> Hi,
>
> The TaskManager which is running the Sync task is logging when it's starting
> the next iteration.
Right!
Later I will post the question and quote your answer with the solution :)
On 26 Jun 2015, at 12:27, Stephan Ewen <se...@apache.org> wrote:
Seems like a good idea to collect these questions.
Stackoverflow is also a good place for "useful tricks"...
On Fri, Jun 26
Yes it does!
Every time I find pages like this that explain internals, they turn out to be
really useful.
On 24 Jun 2015, at 18:35, Stephan Ewen <se...@apache.org> wrote:
Hi Michele!
This may help explain things:
https://cwiki.apache.org/confluence/display/FLINK/Va
Seems like a good idea to collect these questions.
Stackoverflow is also a good place for "useful tricks"...
On Fri, Jun 26, 2015 at 12:25 PM, Michele Bertoni
<michele1.bert...@mail.polimi.it> wrote:
> Got it!
> i will try thanks! :)
>
> What about writing a section of it in the programming guide?
Got it!
i will try thanks! :)
What about writing a section of it in the programming guide?
I found a couple of topics about the readers in the mailing list; it seems it
may be helpful.
On 26 Jun 2015, at 12:21, Stephan Ewen <se...@apache.org> wrote:
Sure, just override
Sure, just override the "createInputSplits()" method. Call for each of your
file paths "super.createInputSplits()" and then combine the results into
one array that you return.
That should do it...
On Fri, Jun 26, 2015 at 12:19 PM, Michele Bertoni
<michele1.bert...@mail.polimi.it> wrote:
> Hi Stephan, thanks for answering,
Hi Stephan, thanks for answering,
right now I am using an extension of DelimitedInputFormat; is there a way to
combine it with option 2?
On 26 Jun 2015, at 12:17, Stephan Ewen <se...@apache.org> wrote:
There are two ways you can realize that:
1) Create multiple
There are two ways you can realize that:
1) Create multiple sources and union them. This is easy, but probably a bit
less efficient.
2) Override the FileInputFormat's createInputSplits method to take a union
of the paths, creating a list of all the files and file splits that will be
read.
Stephan
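To make option 2 concrete for the DelimitedInputFormat case discussed above, here is a minimal sketch against the 0.9-era API. The class name MultiFileFormat and the String record type are illustrative choices, not from the thread, so check the exact signatures against your Flink version:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import org.apache.flink.api.common.io.DelimitedInputFormat;
import org.apache.flink.core.fs.FileInputSplit;
import org.apache.flink.core.fs.Path;

// Illustrative sketch: read a fixed subset of files by asking the parent
// format for the splits of each path and concatenating the results.
public class MultiFileFormat extends DelimitedInputFormat<String> {

    private final Path[] paths;

    public MultiFileFormat(Path[] paths) {
        this.paths = paths;
    }

    @Override
    public FileInputSplit[] createInputSplits(int minNumSplits) throws IOException {
        List<FileInputSplit> all = new ArrayList<FileInputSplit>();
        for (Path path : paths) {
            setFilePath(path); // point the parent at one file at a time
            Collections.addAll(all, super.createInputSplits(minNumSplits));
        }
        return all.toArray(new FileInputSplit[all.size()]);
    }

    @Override
    public String readRecord(String reuse, byte[] bytes, int offset, int numBytes) {
        // each delimited record is returned here as a plain string
        return new String(bytes, offset, numBytes);
    }
}
```

The only part that differs from a single-file format is the loop in createInputSplits: the parent does all the real work per path, and the override just merges the resulting arrays.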
Hi everybody,
is there a way to specify a list of URIs (“hdfs://file1”, ”hdfs://file2”, …)
and open them as different files?
I know I may open the entire directory, but I want to be able to select a
subset of files in the directory.
thanks
Hi Stefan,
You can do this if you disableSysoutLogging and change your
log4j-cli.properties file to also print to console. There you can then
control what is logged to console. However, I think that you have to set
disableSysoutLogging in your program.
Cheers,
Till
On Fri, Jun 26, 2015 at 11:4
There are two different forms of progress logging:
- The log4j framework logs everything (as configured in the
log4j.properties)
- Additionally, the client sysout prints the progress reports
If you want the command line to be quiet, you need to disable sysout
logging on the execution environment.
Hi Robert,
this problem persists in the 0.9 release. Using `disableSysoutLogging()`
works, but I'd rather configure this in the log4j.xml. Is this possible?
Best,
Stefan
On 14 April 2015 at 20:55, Robert Metzger wrote:
> You can control the logging behavior from the ExecutionConfig
> (env.getE
Hi Vasia,
InitVerticesMapper is called in the run method of APSP:

    @Override
    public Graph<..., NullValue> run(Graph<Tuple2<..., NullValue>> input) {

        VertexCentricConfiguration parameters = new
            VertexCentricConfiguration();
        parameters.setSolutionSetUnmanagedMem
Hi Mihail,
could you share your code or at least the implementations of
getVerticesDataSet() and InitVerticesMapper so I can take a look?
Where is InitVerticesMapper called above?
Cheers,
Vasia.
On 26 June 2015 at 10:51, Mihail Vieru wrote:
> Hi Robert,
>
> I'm using the same input data, as
Hi Robert,
I'm using the same input data, as well as the same parameters I use in
the IDE's run configuration.
I don't run the job on the cluster (yet), but locally, by starting Flink
with the start-local.sh script.
I will try to explain my code a bit. The Integer[] array is
initialized i
Looks like an exception in one of the Gelly functions.
Let's wait for someone from Gelly to jump in...
On Thu, Jun 25, 2015 at 7:41 PM, Mihail Vieru wrote:
> Hi,
>
> I get an ArrayIndexOutOfBoundsException when I run my job from a JAR in
> the CLI.
> This doesn't occur in the IDE.
>
> I've bui
Hi,
The TaskManager which is running the Sync task is logging when it's starting
the next iteration. I know it's not very convenient.
You can also log the time and Iteration id (from the
IterationRuntimeContext) in the open() method.
On Fri, Jun 26, 2015 at 9:57 AM, Pa Rö wrote:
> hello flink com
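As a concrete version of the suggestion above, here is a minimal sketch (0.9-era API; the class name TimedMapper, the double[] point type, and the logger setup are illustrative): a rich function running inside the iteration logs the superstep number in open() and the elapsed time in close().

```java
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Illustrative sketch: time each superstep from inside the iteration.
public class TimedMapper extends RichMapFunction<double[], double[]> {

    private static final Logger LOG = LoggerFactory.getLogger(TimedMapper.class);
    private long start;

    @Override
    public void open(Configuration parameters) {
        start = System.currentTimeMillis();
        // getIterationRuntimeContext() is available to rich functions
        // executed inside a Flink iteration
        LOG.info("Starting superstep {}",
                getIterationRuntimeContext().getSuperstepNumber());
    }

    @Override
    public void close() {
        LOG.info("Superstep took {} ms", System.currentTimeMillis() - start);
    }

    @Override
    public double[] map(double[] point) {
        return point; // the real per-iteration work goes here
    }
}
```

Note that open()/close() are called once per task per superstep inside an iteration, so the logged time covers that task's share of the superstep, not the whole cluster-wide iteration.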
Not yet, no. I created a Jira issue:
https://issues.apache.org/jira/browse/FLINK-2277
On Thu, 25 Jun 2015 at 14:48 Sebastian wrote:
> Is there a way to configure this setting for a delta iteration in the
> scala API?
>
> Best,
> Sebastian
>
> On 17.06.2015 10:04, Ufuk Celebi wrote:
> >
> > On 17
Hi Mihail,
the NPE has been thrown from
*graphdistance.APSP$InitVerticesMapper.map(APSP.java:74)*. I guess that is
code written by you or a library you are using.
Maybe the data you are using on the cluster is different from your local
test data?
Best,
Robert
On Thu, Jun 25, 2015 at 7:41 PM, Mi
Hello Flink community,
I have written a k-means app for clustering temporal geo data. Now I want to
know how much time Flink needs to compute one iteration. Is it possible to
measure that, given Flink's execution engine?
best regards,
paul
Hi Aljoscha
You are the best.
Thank you very much.
It is working now.
Best regards
Hawin
On Fri, Jun 26, 2015 at 12:28 AM, Aljoscha Krettek wrote:
> Hi,
> could you please try replacing JavaDefaultStringSchema() with
> SimpleStringSchema() in your first example. The one where you
Hi,
could you please try replacing JavaDefaultStringSchema() with
SimpleStringSchema() in your first example. The one where you get this
exception:
org.apache.commons.lang3.SerializationException:
java.io.StreamCorruptedException: invalid stream header: 68617769
Cheers,
Aljoscha
On Fri, 26 Jun 20
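For reference, the suggested change amounts to swapping the schema passed to the Kafka source. This is a sketch against the 0.9-era streaming connector; the ZooKeeper address and topic name are placeholders, and the KafkaSource constructor arguments should be checked against your connector version:

```java
// SimpleStringSchema treats each Kafka message as a plain string, whereas
// JavaDefaultStringSchema expects Java-serialized payloads (hence the
// "invalid stream header" when raw text arrives).
DataStream<String> stream = env.addSource(
        new KafkaSource<String>("localhost:2181", "my-topic",
                new SimpleStringSchema()));
```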