Hi all,

Those of you who have been following the python2/3 topics on this thread know that the industry I work in is a bit behind the times; we're still waiting for all the libraries we need to be ported to python3. Here's a handy website for those who are curious: https://vfxpy.com/
I'm hoping that we'll have most of what we need by EOY or shortly thereafter, but on the Beam side that leaves us with a gap of 3 months or more. I was hoping that we'd be able to coast over that gap using Beam 2.24, but Dataflow dropped support for python2 a week ago [1].

So my question for the Dataflow team is this: if we lock down our pipelines to Beam 2.24 -- the last version of Beam to officially support python2 -- and run our SDK workers inside docker containers, will new jobs submitted to Dataflow continue to work on python2? Or will the Dataflow runner itself stop being able to execute python2 code regardless of the Beam version used? As an example, I know that PubSubIO, which we use extensively, is not part of Beam itself, but is sorta magically patched into the pipeline by Dataflow.

I will say that when I agreed that I was satisfied with Beam 2.24 being the last release to support python2, I did so under the assumption that pipelines running 2.24 would be supported for longer than a few weeks: 2.24 was released to PyPI on Sept 16, and python2 pipelines became unsupported on Dataflow on October 7th.

thanks,
-chad

[1] https://cloud.google.com/python/docs/python2-sunset/#dataflow
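P.S. For concreteness, here's roughly the lockdown setup I have in mind. This is only a sketch, not something we've validated against the deprecation: the project, bucket, pipeline module, and container image names are placeholders, and my understanding is that custom SDK containers on Dataflow require runner v2.

```shell
# Pin to the last Beam release that supports python2,
# so pip can't drift onto a python3-only version
pip install 'apache-beam[gcp]==2.24.0'

# Submit with a custom python2 SDK harness container
# (flag name as of Beam 2.24; later releases renamed it)
python -m my_pipeline \
  --runner=DataflowRunner \
  --project=my-project \
  --region=us-central1 \
  --temp_location=gs://my-bucket/tmp \
  --experiments=use_runner_v2 \
  --worker_harness_container_image=gcr.io/my-project/beam-python2-sdk:2.24.0
```

The question above is whether jobs submitted this way keep working once the service itself stops accepting python2, container or no container.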