Hi Flink Team,
I'm trying to implement an application in PyFlink.
I would like to structure the directory as follows:
flink_app/
    data_service/
        s3.py
        filesystem.py
    validator/
        validator.py
    metrics/
        statictic.py
        quality.py
    common/
        constants.py
    main.py    <- entry job
Two questions:
1) Is it possible to import constants from the common package inside the data_service package? In plain Python we would use an absolute import like "from flink_app.common import constants".
All files are shipped to Flink with:

    env = StreamExecutionEnvironment.get_execution_environment()
    env.add_python_file('/path_to_flink_app/flink_app')
2) Can I split the pipeline across multiple files instead of keeping it all in main.py, e.g. pass the env (or a DataStream/Table) to another module and get a DataStream/Table back?
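Something like the following is what I mean (again only a sketch; the module layout matches the tree above, and the validate function is a made-up example):

    # validator/validator.py
    from pyflink.datastream import DataStream

    def validate(stream: DataStream) -> DataStream:
        # hypothetical validation step: drop None records
        return stream.filter(lambda record: record is not None)

    # main.py
    from pyflink.datastream import StreamExecutionEnvironment
    from validator import validator

    def run():
        env = StreamExecutionEnvironment.get_execution_environment()
        env.add_python_file('/path_to_flink_app/flink_app')
        stream = env.from_collection(['a', None, 'b'])
        # part of the pipeline is built in another module and returned here
        validated = validator.validate(stream)
        validated.print()
        env.execute('flink_app')

    if __name__ == '__main__':
        run()

Is this kind of structure supported / recommended, or should everything that touches env stay in the entry file?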