I know exactly what to visualize. As I wrote, it is the latest result of the
Flink job that I would like to visualize. There is no need to use
Elasticsearch to find it first.
The data is of such a nature that it could be written into a file every 10
seconds, meaning that the file at all times w…
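For what it's worth, a rough, untested sketch of that idea (the source, the
analysis step, and the output path are placeholders): aggregate a 10-second
tumbling window and write the result through a sink that rewrites the file
on every record, so the file only ever holds the latest result.

  import java.io.PrintWriter
  import org.apache.flink.streaming.api.scala._
  import org.apache.flink.streaming.api.functions.sink.RichSinkFunction
  import org.apache.flink.streaming.api.windowing.time.Time

  // Hypothetical sink that rewrites a single file per record, so the
  // file always holds only the most recent windowed result.
  class LatestResultSink(path: String) extends RichSinkFunction[Long] {
    override def invoke(value: Long): Unit = {
      val out = new PrintWriter(path) // truncates any existing file
      try out.println(value) finally out.close()
    }
  }

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  env.socketTextStream("localhost", 9999) // placeholder source
    .map(_.length.toLong)                 // placeholder "analysis"
    .timeWindowAll(Time.seconds(10))
    .reduce(_ + _)
    .addSink(new LatestResultSink("/tmp/latest-result"))
  env.execute("periodic-visualization-feed")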
Hi,
I would like to know how to write a Matrix or Vector (Dense/Sparse) to a
file?
Thanks in advance!
Best regards,
Lydia
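There doesn't seem to be a dedicated writer for these types, so one
untested sketch (assuming org.apache.flink.ml.math.SparseMatrix with its
numRows/numCols fields and apply(row, col) accessor; the path and line
format are invented) is to dump the non-zero entries in COO form, which
can later be read back and rebuilt with SparseMatrix.fromCOO:

  import java.io.PrintWriter
  import org.apache.flink.ml.math.SparseMatrix

  // Write the non-zero entries of a sparse matrix as "row,col,value"
  // lines, i.e. the same COO triples that fromCOO consumes.
  def writeMatrix(m: SparseMatrix, path: String): Unit = {
    val out = new PrintWriter(path)
    try {
      for (i <- 0 until m.numRows; j <- 0 until m.numCols) {
        val v = m(i, j)
        if (v != 0.0) out.println(s"$i,$j,$v")
      }
    } finally {
      out.close()
    }
  }

The same loop works for a DenseMatrix if the zero check is dropped.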
Hi all,
I have two questions regarding sparse matrices:
1. I have a sparse Matrix: val sparseMatrix = SparseMatrix.fromCOO(row, col,
csvInput.collect())
and now I would like to extract all the values that are in a specific row
X. How would I tackle that? flatMap() and filter() do not seem to be
supported…
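Since SparseMatrix.fromCOO consumes (row, col, value) triples anyway, one
untested thought is to do the filtering on the COO DataSet before building
the matrix (the row index 42 is just an example):

  import org.apache.flink.api.scala._

  // csvInput: DataSet[(Int, Int, Double)] in (row, col, value) COO form.
  // Keep only the entries that belong to row x of the matrix.
  val x = 42 // hypothetical row of interest
  val rowX: DataSet[(Int, Int, Double)] = csvInput.filter(_._1 == x)
  val rowValues: Seq[Double] = rowX.collect().map(_._3)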
Hi,
I have a source where I am using the collectWithTimestamp method and
streaming the timestamp along with the actual data.
Now I want to get the value of the timestamp in the map function. I tried
looking for that in the documentation at the following link and in the
ExtractTimestamp method inter…
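One workaround (sketch only; the type and payload are invented) is to emit
the timestamp as part of the record itself, so any downstream map function
can read it without needing operator-level access:

  import org.apache.flink.streaming.api.functions.source.SourceFunction
  import org.apache.flink.streaming.api.functions.source.SourceFunction.SourceContext

  // Emit (value, timestamp) pairs so the timestamp travels with the data.
  class TimestampedSource extends SourceFunction[(String, Long)] {
    @volatile private var running = true

    override def run(ctx: SourceContext[(String, Long)]): Unit = {
      while (running) {
        val ts = System.currentTimeMillis() // hypothetical timestamp
        val value = "event"                 // hypothetical payload
        // Attach ts as the event time AND keep it inside the record.
        ctx.collectWithTimestamp((value, ts), ts)
        Thread.sleep(1000)
      }
    }

    override def cancel(): Unit = { running = false }
  }

  // Downstream, the map function simply destructures the tuple:
  // stream.map { case (value, ts) => s"$value@$ts" }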
> I am thinking it may not be the best fit, because Elastic is by nature a
> search engine that is good for trending and stuff like that - not an
> entire replacement of the current view.
Why do you think that Elasticsearch is not the right tool? To visualise
something, you first have to find what to visuali…
Ah yes, if you used a local filesystem for backups this certainly was the
source of the problem.
On Sun, 29 May 2016 at 17:57 arpit srivastava wrote:
> I think the problem was that I was using the local filesystem in a
> cluster. Now I have switched to HDFS.
>
> Thanks,
> Arpit
>
> On Sun, May 29, 2…
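For anyone hitting the same issue, the relevant flink-conf.yaml entries
for checkpointing to HDFS look roughly like this (namenode host, port and
path are placeholders):

  # flink-conf.yaml - filesystem state backend writing to HDFS
  state.backend: filesystem
  state.backend.fs.checkpointdir: hdfs://namenode:8020/flink/checkpoints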
Hi there,
I am using Flink to analyse a lot of incoming data. Every 10 seconds it
makes sense to present the analysis so far as some form of visualization.
Every 10 seconds I will therefore replace the current contents of the
visualization/presentation with the analysis result of the most recent…
I think the problem was that I was using the local filesystem in a
cluster. Now I have switched to HDFS.
Thanks,
Arpit
On Sun, May 29, 2016 at 12:57 PM, Aljoscha Krettek wrote:
> Hi,
> could you please provide the code of your user function that has the
> Checkpointed interface and is keeping state? This might give people a
> chance of understanding what is going on.
Hi again,
from your diagram I have put together a gist, which I think does the
job. I haven't had the time to test it though :(
https://gist.github.com/knaufk/d1312503b99ee51554a70c9a22abe7e5
If you have any questions, let me know. It sometimes just takes a while
until I answer ;)
Cheers,
Kons
Hi everyone,
In a step function of a bulk iteration I'd like to join the working set W
with another data set T. The join field of T depends on the current
superstep. Unfortunately, W has no access to the iteration runtime
context.
I tried to extract the current superstep at the beginning of
the step functio…
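One thing worth trying (untested sketch; element types are invented): rich
functions that run inside the step function can reach the superstep number
through getIterationRuntimeContext, so W can be tagged with the superstep
and the join key for T derived from that:

  import org.apache.flink.api.common.functions.RichMapFunction
  import org.apache.flink.api.scala._

  // Tag each element of the working set with the current superstep so
  // that a plain join against T on that field becomes possible.
  class TagWithSuperstep extends RichMapFunction[Long, (Int, Long)] {
    override def map(value: Long): (Int, Long) = {
      val step = getIterationRuntimeContext.getSuperstepNumber
      (step, value)
    }
  }

  // Inside the step function, with w: DataSet[Long] and
  // t: DataSet[(Int, Long)]:
  //   val tagged = w.map(new TagWithSuperstep)
  //   val joined = tagged.join(t).where(0).equalTo(0)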
Aah, thanks a lot for that insight. I'm pretty new to Flink and learning
on my own, so I'm prone to making mistakes.
Thanks a lot for helping.
--
View this message in context:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Reading-Parameter-values-sent-to-partition-tp7
Hi,
could you please provide the code of your user function that has the
Checkpointed interface and is keeping state? This might give people a
chance of understanding what is going on.
Cheers,
Aljoscha
On Sat, 28 May 2016 at 20:55 arpit srivastava wrote:
> Hi,
>
> I am using Flink on a YARN clust…
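For readers of the archive, a minimal sketch of the kind of function being
asked about (illustrative only, not the actual code in question), using
the Checkpointed interface of that Flink generation:

  import org.apache.flink.api.common.functions.RichMapFunction
  import org.apache.flink.streaming.api.checkpoint.Checkpointed

  // Hypothetical stateful function: counts records and snapshots the
  // counter on every checkpoint via the (pre-1.2) Checkpointed interface.
  class CountingMap extends RichMapFunction[String, (String, Long)]
      with Checkpointed[java.lang.Long] {

    private var count: Long = 0L

    override def map(value: String): (String, Long) = {
      count += 1
      (value, count)
    }

    override def snapshotState(checkpointId: Long,
                               checkpointTimestamp: Long): java.lang.Long =
      count

    override def restoreState(state: java.lang.Long): Unit =
      count = state
  }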
Hi,
using DataStreamUtils.collect() in a test is difficult due to
synchronization problems, as you discovered yourself.
What I propose is to write a custom Sink that collects data and verifies
the results. Verification should happen both in the invoke() method and
in close(). For the sink, you sho…
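A rough sketch of that pattern (the names and checks are invented):

  import org.apache.flink.streaming.api.functions.sink.RichSinkFunction
  import scala.collection.mutable.ArrayBuffer

  // Test sink: collects every element it sees, checks each one as it
  // arrives in invoke(), and checks the complete result in close().
  class VerifyingSink(expected: Seq[String])
      extends RichSinkFunction[String] {

    private val seen = ArrayBuffer[String]()

    override def invoke(value: String): Unit = {
      seen += value
      // Per-record verification: nothing unexpected should arrive.
      assert(expected.contains(value), s"unexpected element: $value")
    }

    override def close(): Unit = {
      // Final verification: everything expected has arrived.
      assert(seen.sorted == expected.sorted,
        s"expected ${expected.size} elements, got ${seen.size}")
    }
  }

Note that this only works reliably with parallelism 1, since each parallel
sink instance sees just a slice of the stream.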