…the JobGraph file is still accessible?
>
> Cheers,
> Till
>
> On Wed, May 8, 2019 at 11:22 AM Manjusha Vuyyuru
> wrote:
>
Any update on this from community side?
On Tue, May 7, 2019 at 6:43 PM Manjusha Vuyyuru
wrote:
im using 1.7.2.
On Tue, May 7, 2019 at 5:50 PM miki haiat wrote:
> Which flink version are you using?
> I had similar issues with 1.5.x
>
> On Tue, May 7, 2019 at 2:49 PM Manjusha Vuyyuru
> wrote:
>
Hello,
I have a Flink setup with two JobManagers coordinated by ZooKeeper.
I see the exception below, and both JobManagers go down:

2019-05-07 08:29:13,346 INFO  org.apache.flink.runtime.jobmanager.ZooKeeperSubmittedJobGraphStore  - Released locks of job graph f8eb1b482d8ec8c1d3e94c4d0f79d…
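For context on the HA setup described above: the job graph files referenced by ZooKeeperSubmittedJobGraphStore live under the configured high-availability storage directory, which must be durable and reachable by both JobManagers. A minimal flink-conf.yaml sketch (quorum addresses and the path are placeholders, not from the thread):

```yaml
# High-availability settings for two JobManagers coordinated by ZooKeeper.
# The storageDir must be a durable, shared location (e.g. HDFS or NFS);
# if a submitted JobGraph file disappears from it, job recovery fails.
high-availability: zookeeper
high-availability.zookeeper.quorum: zk1:2181,zk2:2181,zk3:2181
high-availability.storageDir: hdfs:///flink/ha/
```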
> …'s enough of that running in production and don't expect any of
> that yet to be part of the Flink codebase.
>
> On Thu, Apr 18, 2019 at 10:01 AM Manjusha Vuyyuru
> wrote:
>
Hello,
Does Flink have any plans to support deep learning in the near future?
Thanks,
Manju
Hello,
Can someone please explain the functionality of the blob server in Flink?
Thanks,
Manju
But 'JDBCInputFormat' will exit once it's done reading all the data. I need
something that keeps polling MySQL and fetches any updates or changes.
Thanks,
manju
On Wed, Jan 23, 2019 at 7:10 AM Zhenghua Gao wrote:
> Actually flink-connectors/flink-jdbc module provided a JDBCInputFo
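Not part of the thread, but the polling behaviour asked for above can be sketched as a custom source instead of JDBCInputFormat. This is a hedged sketch against the Flink 1.7 DataStream API; the connection URL, credentials, table name, and poll interval are placeholders, not from the thread:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.source.RichSourceFunction;

// Sketch of a source that re-queries MySQL on an interval instead of
// exiting after one pass, as JDBCInputFormat does.
public class PollingJdbcSource extends RichSourceFunction<String> {

    private volatile boolean running = true;
    private transient Connection connection;

    @Override
    public void open(Configuration parameters) throws Exception {
        // Placeholder URL and credentials.
        connection = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
    }

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        while (running) {
            try (Statement stmt = connection.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT id, value FROM config")) {
                while (rs.next()) {
                    // Emits every row each poll; detecting which rows actually
                    // changed is left to the caller.
                    ctx.collect(rs.getLong("id") + "=" + rs.getString("value"));
                }
            }
            Thread.sleep(60_000L); // poll once a minute (placeholder interval)
        }
    }

    @Override
    public void cancel() {
        running = false;
    }
}
```

A source like this would be attached with env.addSource(new PollingJdbcSource()); change detection (for example, comparing each poll against the last emitted snapshot) still needs to be added on top.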
Hello,
Does Flink 1.7.1 support connecting to a relational database (MySQL)?
I want to use MySQL as my streaming source to read some configuration.
Thanks,
Manju
Hi Kostas,
I have a similar scenario where I have to clear window elements upon
reaching some count, or clear them if they are older than one hour.
I'm using the approach below; I just wanted to know if it's the right way:
DataStream<...> out = mappedFields
    .map(new CustomMapFunction())
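One way to express "fire on a count, or after one hour" is a custom window Trigger rather than clearing elements in a map function. The sketch below is an illustration under assumptions (the class name, count threshold, and the use of processing time are mine, not from the thread); it mirrors the structure of Flink's built-in CountTrigger:

```java
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.common.state.ReducingState;
import org.apache.flink.api.common.state.ReducingStateDescriptor;
import org.apache.flink.api.common.typeutils.base.LongSerializer;
import org.apache.flink.streaming.api.windowing.triggers.Trigger;
import org.apache.flink.streaming.api.windowing.triggers.TriggerResult;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;

// Fires and purges a window either when it has seen maxCount elements
// or when the window's processing-time timer (its end) goes off.
public class CountOrTimeTrigger<T> extends Trigger<T, TimeWindow> {

    private final long maxCount;
    private final ReducingStateDescriptor<Long> countDesc =
            new ReducingStateDescriptor<>("count", new Sum(), LongSerializer.INSTANCE);

    public CountOrTimeTrigger(long maxCount) {
        this.maxCount = maxCount;
    }

    @Override
    public TriggerResult onElement(T element, long timestamp, TimeWindow window,
                                   TriggerContext ctx) throws Exception {
        // Timer at the window end; with one-hour windows this is the hour bound.
        ctx.registerProcessingTimeTimer(window.maxTimestamp());
        ReducingState<Long> count = ctx.getPartitionedState(countDesc);
        count.add(1L);
        if (count.get() >= maxCount) {
            count.clear();
            return TriggerResult.FIRE_AND_PURGE;
        }
        return TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onProcessingTime(long time, TimeWindow window,
                                          TriggerContext ctx) {
        return TriggerResult.FIRE_AND_PURGE;
    }

    @Override
    public TriggerResult onEventTime(long time, TimeWindow window,
                                     TriggerContext ctx) {
        return TriggerResult.CONTINUE;
    }

    @Override
    public void clear(TimeWindow window, TriggerContext ctx) throws Exception {
        ctx.deleteProcessingTimeTimer(window.maxTimestamp());
        ctx.getPartitionedState(countDesc).clear();
    }

    private static class Sum implements ReduceFunction<Long> {
        @Override
        public Long reduce(Long a, Long b) {
            return a + b;
        }
    }
}
```

It would be used as .window(TumblingProcessingTimeWindows.of(Time.hours(1))).trigger(new CountOrTimeTrigger<>(1000)), so that the timer registered at window.maxTimestamp() corresponds to the one-hour bound.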
> …and on the JobManager, they are only accessed during deployments, so that
> falls under this cleanup detection.
> A solution is to change the BLOB storage directory.
>
>
> Nico
>
> [1]
>
> https://data-artisans.com/flink-forward-berlin/resources/our-successful-journey
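For reference, changing the BLOB storage directory as Nico suggests is a single flink-conf.yaml entry; the path below is a placeholder:

```yaml
# Move the blob server's local storage out of /tmp so OS cleanup jobs
# cannot delete cached blobs between deployments (path is a placeholder).
blob.storage.directory: /var/lib/flink/blob-storage
```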
> …'ed) might be able to
> help you.
>
> Best,
>
> Dawid
> On 23/10/2018 06:58, Manjusha Vuyyuru wrote:
Hello All,
I have a job which fails, say, every 14 days with an IOException:
failed to fetch blob.
I submitted the job from the command line using a java jar. Below is the
exception I'm getting:
java.io.IOException: Failed to fetch BLOB d23d168655dd51efe4764f9b22b85a18/p-446f7e0137fd66af062de7a…