Hi, I have some questions:
1. How can you easily manage multiple jars (jobs) with Flink?
2. Should all jobs run on the same task manager, or do we need a separate one for each job?
3. Can we store the jars in some persistent storage (such as S3) and start a job for each jar from that storage?
4. Also, how can we upgrade a jar (job)? Do we need to stop all jobs and update the task manager for that? Or, if we use that persistent storage for the jars, is it enough to update the jar in the storage and restart just the corresponding job?

For now, we have Flink deployed on Kubernetes, but with only one job running. We upgrade it by stopping the job, upgrading the task manager and job manager with a new Docker image containing the updated jar, and then starting the job again from the latest savepoint/checkpoint.

Thank you,
Alex
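
P.S. For context, this is roughly what our current single-job upgrade looks like as commands. It is only a sketch: the job ID, image names, deployment names, and savepoint paths below are placeholders, and in practice we drive the same steps through our Kubernetes tooling rather than running these literally.

```bash
# Sketch of our current single-job upgrade flow (placeholders throughout).

# 1. Stop the running job and take a final savepoint.
flink stop --savepointPath s3://my-bucket/savepoints <job-id>

# 2. Roll out the new Docker image (which contains the updated job jar)
#    to the job manager and task manager deployments.
kubectl set image deployment/flink-jobmanager jobmanager=my-registry/flink-job:v2
kubectl set image deployment/flink-taskmanager taskmanager=my-registry/flink-job:v2

# 3. Resume the job from the savepoint taken in step 1.
flink run --detached --fromSavepoint s3://my-bucket/savepoints/<savepoint-dir> \
    /opt/flink/usrlib/my-job.jar
```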