Hi Dian,

Thanks a lot for the pointer!

I tried `-pyreq` and in general it worked as expected. But another problem 
arises when I specify more dependencies in the requirements.txt. 
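
For reference, this is roughly how I submit the job (a minimal sketch; the 
job file name and paths are placeholders, not my actual setup):

```shell
# Submit a PyFlink job and ship a requirements.txt with it.
# Each TaskManager pip-installs these dependencies into its Python
# environment before executing the Python UDFs.
flink run \
  -py my_job.py \
  -pyreq requirements.txt
```

An optional cached-packages directory can be attached after a `#` 
(e.g. `-pyreq requirements.txt#cached_dir`) so workers install from 
pre-downloaded packages instead of hitting PyPI.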

The TaskManagers are often lost (unreachable from the JobManager) while 
pip install is running. 

I wonder if the installation affects the heartbeats between the TaskManagers 
and the JobManager?

Best,
Paul Lam

> On Dec 23, 2021, at 19:16, Dian Fu <dian0511...@gmail.com> wrote:
> 
> Hi Paul,
> 
> Currently, you need to build venv in an environment where you want to execute 
> the PyFlink jobs.
> 
> >> Also, I wonder if it’s possible for pyflink to optionally provide an 
> >> automatically created venv for each pyflink job?
> Do you mean to create the venv during executing the job? If this is your 
> requirement, maybe you could try to specify a requirements.txt file [1]. It 
> will prepare the Python environment according to the requirements.txt during 
> executing the job.
> 
> Regards,
> Dian
> 
> [1] 
> https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/dev/python/dependency_management/#requirementstxt
>  
>  
> 
> On Thu, Dec 23, 2021 at 5:15 PM Paul Lam <paullin3...@gmail.com 
> <mailto:paullin3...@gmail.com>> wrote:
> Hi,
> 
> The document says we could use `-pyarch` to upload python venv, but I found 
> this is often problematic because users may not have the python binaries that 
> fits the flink runtime environment. 
> 
> For example, a user may upload a venv for macOS within the project, but the 
> Flink cluster is using Debian, causing pyflink to fail.
> 
> Is there some good practice to avoid this? Also, I wonder if it’s possible 
> for pyflink to optionally provide an automatically created venv for each 
> pyflink job?
> 
> Thanks!
> 
> Best,
> Paul Lam
> 
