I am running the container using Docker Desktop. I ran it with the default 
setup and pointed to localhost:8099, but the connection times out. 
I am also running a Flink cluster, and I can connect to its UI at 
localhost:8081.
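For what it's worth, a quick way to confirm whether anything is actually listening on the job endpoint port before submitting (a minimal sketch; the host and port are the ones from this thread and may differ in your setup):

```python
# Minimal pre-flight check: confirm the job server port accepts TCP
# connections. A grpc.FutureTimeoutError during channel setup usually
# means this connection never succeeds.
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(port_open("localhost", 8099))
```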


------------------------------------------------------------------------------------
ext/Write/WriteImpl/FinalizeWrite:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
Traceback (most recent call last):
  File "C:\Users\ashish.raghav\AppData\Local\Programs\Python\Python37\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "C:\Users\ashish.raghav\AppData\Local\Programs\Python\Python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\ashish.raghav\Desktop\projects\test-flink\wc_minimal.py", line 141, in <module>
    run()
  File "C:\Users\ashish.raghav\Desktop\projects\test-flink\wc_minimal.py", line 136, in run
    output | WriteToText(known_args.output)
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\apache_beam\pipeline.py", line 524, in __exit__
    self.run().wait_until_finish()
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\apache_beam\pipeline.py", line 497, in run
    self._options).run(False)
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\apache_beam\pipeline.py", line 510, in run
    return self.runner.run_pipeline(self, self._options)
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\apache_beam\runners\portability\portable_runner.py", line 404, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\apache_beam\runners\portability\portable_runner.py", line 311, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\apache_beam\runners\portability\job_server.py", line 58, in start
    grpc.channel_ready_future(channel).result(timeout=self._timeout)
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\grpc\_utilities.py", line 140, in result
    self._block(timeout)
  File "C:\Users\ashish.raghav\Desktop\projects\beam-venv\lib\site-packages\grpc\_utilities.py", line 86, in _block
    raise grpc.FutureTimeoutError()
grpc.FutureTimeoutError


-----Original Message-----
From: Ashish Raghav <ashish.rag...@corecompete.com> 
Sent: 28 May 2020 18:22
To: user@beam.apache.org
Subject: RE: Issue while submitting python beam pipeline on flink - local

EXTERNAL EMAIL
Do not click links or open attachments unless you recognise the sender and know 
the content is safe. Report suspicious email to info...@corecompete.com.



Ok, will try that.



-----Original Message-----
From: Maximilian Michels <m...@apache.org>
Sent: 28 May 2020 18:20
To: user@beam.apache.org
Subject: Re: Issue while submitting python beam pipeline on flink - local


You are using the LOOPBACK environment, which requires that the Flink cluster 
can connect back to your local machine. Since the loopback environment by 
default binds to localhost, that is not possible.

I'd suggest using the default Docker environment.
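For reference, a sketch of the submission options with the default Docker environment in place of LOOPBACK (the job endpoint value is the one used earlier in this thread; adjust as needed):

```python
# Sketch of pipeline options using the default DOCKER environment,
# as suggested above. The endpoint address is taken from this thread
# and may differ in your setup.
pipeline_args = [
    "--runner=PortableRunner",
    "--job_endpoint=localhost:8099",
    "--environment_type=DOCKER",  # default; SDK harness runs in a container
]

def environment_type(args):
    """Extract the environment_type option, defaulting to DOCKER."""
    for arg in args:
        if arg.startswith("--environment_type="):
            return arg.split("=", 1)[1]
    return "DOCKER"

print(environment_type(pipeline_args))  # DOCKER
```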

On 28.05.20 14:06, Ashish Raghav wrote:
> Now it fails with this error:
>
>
>
>
>
> WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
>
> INFO:root:Using Python SDK docker image: apache/beam_python3.7_sdk:2.21.0. If the image is not available at local, we will try to pull from hub.docker.com
>
> INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x0000027A25480CA8> ====================
>
> Traceback (most recent call last):
>   File "C:\Users\ashish.raghav\AppData\Local\Programs\Python\Python37\lib\runpy.py", line 193, in _run_module_as_main
>     "__main__", mod_spec)
>   File "C:\Users\ashish.raghav\AppData\Local\Programs\Python\Python37\lib\runpy.py", line 85, in _run_code
>     exec(code, run_globals)
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\wc_minimal.py", line 145, in <module>
>     run()
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\wc_minimal.py", line 140, in run
>     output | WriteToText(known_args.output)
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\pipeline.py", line 524, in __exit__
>     self.run().wait_until_finish()
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\pipeline.py", line 497, in run
>     self._options).run(False)
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\pipeline.py", line 510, in run
>     return self.runner.run_pipeline(self, self._options)
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\runners\portability\portable_runner.py", line 406, in run_pipeline
>     job_service_handle.submit(proto_pipeline)
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\runners\portability\portable_runner.py", line 107, in submit
>     prepare_response.staging_session_token)
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\runners\portability\portable_runner.py", line 204, in stage
>     stager.stage_job_resources(resources, staging_location='')
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\runners\portability\stager.py", line 305, in stage_job_resources
>     file_path, FileSystems.join(staging_location, staged_path))
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\apache_beam\runners\portability\portable_stager.py", line 98, in stage_artifact
>     self._artifact_staging_stub.PutArtifact(artifact_request_generator())
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\grpc\_channel.py", line 1011, in __call__
>     return _end_unary_response_blocking(state, call, False, None)
>   File "C:\Users\ashish.raghav\Desktop\projects\test-flink\.venv\lib\site-packages\grpc\_channel.py", line 729, in _end_unary_response_blocking
>     raise _InactiveRpcError(state)
> grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
>         status = StatusCode.UNAVAILABLE
>         details = "failed to connect to all addresses"
>         debug_error_string = "{"created":"@1590667551.301000000","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3962,"referenced_errors":[{"created":"@1590667551.301000000","description":"failed to connect to all addresses","file":"src/core/ext/filters/client_channel/lb_policy/pick_first/pick_first.cc","file_line":394,"grpc_status":14}]}"
>
>> python -m wc_minimal --input C:\Users\ashish.raghav\Desktop\projects\test-flink\input.txt --output C:\Users\ashish.raghav\Desktop\projects\test-flink\output>
>
> [The same warnings, traceback, and StatusCode.UNAVAILABLE error were printed again for this second run.]
>
>
>
>
>
>
>
> *From:*Ashish Raghav <ashish.rag...@corecompete.com>
> *Sent:* 28 May 2020 17:05
> *To:* user@beam.apache.org
> *Subject:* RE: Issue while submitting python beam pipeline on flink - 
> local
>
>
>
>
> Ok. Will test with latest version.
>
>
>
> *From:*Kyle Weaver <kcwea...@google.com <mailto:kcwea...@google.com>>
> *Sent:* 28 May 2020 17:03
> *To:* user@beam.apache.org <mailto:user@beam.apache.org>
> *Subject:* Re: Issue while submitting python beam pipeline on flink - 
> local
>
>
>
>
> Oh, I see the issue. The documentation you were looking at might 
> contain an outdated reference to the apachebeam repo on Docker Hub. We 
> have migrated all Docker images to the apache top-level repository, so 
> instead of apachebeam/flink1.9_job_server, you should use 
> apache/beam_flink1.9_job_server.
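The rename described above can be summarized as a simple mapping (an illustrative helper, generalized from the one example given in this thread; the general pattern is an assumption):

```python
# Illustrative helper for the Docker Hub migration described above:
# images moved from the "apachebeam" org to "apache" with a "beam_"
# name prefix. Only the job-server example is confirmed by the thread.
def migrate_image_name(old_name):
    org, sep, name = old_name.partition("/")
    if sep and org == "apachebeam":
        return "apache/beam_" + name
    return old_name  # already migrated, or not a Beam image

print(migrate_image_name("apachebeam/flink1.9_job_server"))
# apache/beam_flink1.9_job_server
```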
>
>
>
> On Thu, May 28, 2020 at 7:31 AM Kyle Weaver <kcwea...@google.com 
> <mailto:kcwea...@google.com>> wrote:
>
>     2.21.0 should be available now:
> https://hub.docker.com/layers/apache/beam_flink1.9_job_server/2.21.0/images/sha256-eeac6dd4571794a8f985e9967fa0c1522aa56a28b5b0a0a34490a600065f096d?context=explore
>
>
>
>     On Thu, May 28, 2020 at 7:27 AM Ashish Raghav
>     <ashish.rag...@corecompete.com
>     <mailto:ashish.rag...@corecompete.com>> wrote:
>
>         Hi Kyle,
>
>
>
>         The latest version available on Docker Hub is 2.20.0.
>
>
>
>
>
>         *From:*Kyle Weaver <kcwea...@google.com
>         <mailto:kcwea...@google.com>>
>         *Sent:* 28 May 2020 16:51
>         *To:* user@beam.apache.org <mailto:user@beam.apache.org>
>         *Subject:* Re: Issue while submitting python beam pipeline on
>         flink - local
>
>
>
>
>
>         Hi Ashish, can you check to make
>         sure apachebeam/flink1.9_job_server is also on version 2.21.0?
>
>
>
>         On Thu, May 28, 2020 at 7:13 AM Ashish Raghav
>         <ashish.rag...@corecompete.com
>         <mailto:ashish.rag...@corecompete.com>> wrote:
>
>             Hello guys,
>
>
>
>             I am trying to run a Python Beam pipeline on Flink. I am
>             trying to run apache_beam.examples.wordcount_minimal, but
>             with PipelineOptions as
>
>             "--runner=PortableRunner",
>             "--job_endpoint=192.168.99.100:8099",
>             "--environment_type=LOOPBACK",
>
>
>
>             I have an apachebeam/flink1.9_job_server running in a local
>             container. Whenever I submit the job, I get errors (log
>             attached). When I run it with DirectRunner, though, the
>             code runs fine.
>
>
>
>             versions:
>
>             Apache Beam SDK=2.21.0
>             python=3.7.6
>
>
>
>             I have followed all the instructions from the Apache Beam
>             documentation, but I seem to have hit a roadblock here.
>             Please suggest.
>
>
>
>             Please let me know if any other information is required.
>
>
>
>             *Ashish Raghav | DE*
>
>             *Core Compete
>             <https://corecompete.com/>* | ashish.rag...@corecompete.com
>             <mailto:adwait.th...@corecompete.com>
>
>             /Accelerating Cloud Analytics/
>
>
>
