Hi all,
I need to write a DataStream with a java.util.Instant field to
parquet files in S3. I couldn't find any straightforward way to do that, so
I changed the POJO class to an Avro SpecificRecord (I followed this example
https://github.com/aws-samples/amazon-managed-service-for-apache-flink-exampl
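For context, a minimal sketch of that approach, assuming a generated Avro class (here called `MyRecord`, a placeholder name) whose `Instant` field was remapped to an Avro `timestamp-millis` logical type, and the `flink-parquet` dependency on the classpath:

```java
// Hedged sketch: writing Avro SpecificRecords as Parquet files to S3 via
// Flink's FileSink. `MyRecord` and the bucket path are placeholders, not
// names from the thread.
import org.apache.flink.connector.file.sink.FileSink;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.avro.AvroParquetWriters;

public class ParquetS3SinkFactory {
    public static FileSink<MyRecord> create() {
        return FileSink
                .forBulkFormat(
                        new Path("s3://my-bucket/output/"),   // placeholder bucket
                        AvroParquetWriters.forSpecificRecord(MyRecord.class))
                .build();
    }
}
```

The stream would then be attached with `stream.sinkTo(ParquetS3SinkFactory.create())`; this is sink wiring only and needs a running Flink job to execute.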
Hi all,
I'm trying to follow the code in 1.16 SNAPSHOT to have a Kinesis sink in
PyFlink 1.15, to write the output of a KeyedCoProcessFunction to Kinesis.
1. If I use ".set_serialization_schema(SimpleStringSchema())", then I got
the error message:
java.lang.ClassCastException: class [B cannot be
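For reference, `[B` is JVM shorthand for `byte[]`: when the process function's `output_type` is not declared, records cross the Python-to-Java boundary as pickled bytes, and `SimpleStringSchema` then fails to cast them to `String`. A minimal sketch of the 1.16-style sink wiring, assuming the Kinesis connector jar is on the classpath; the region, stream name, and function name are placeholders:

```python
# Hedged sketch of a KinesisStreamsSink in PyFlink (1.16-style API).
# The key point is declaring output_type on the process function so the
# sink receives Strings rather than pickled byte arrays.
from pyflink.common import Types
from pyflink.common.serialization import SimpleStringSchema
from pyflink.datastream.connectors.kinesis import (
    KinesisStreamsSink,
    PartitionKeyGenerator,
)

sink = (KinesisStreamsSink.builder()
        .set_kinesis_client_properties({"aws.region": "ap-southeast-2"})
        .set_serialization_schema(SimpleStringSchema())
        .set_partition_key_generator(PartitionKeyGenerator.random())
        .set_stream_name("my-output-stream")   # placeholder stream name
        .build())

# result = connected_stream.process(MyCoProcess(),            # placeholder
#                                   output_type=Types.STRING())
# result.sink_to(sink)
```

This is connector wiring only; it needs a PyFlink environment and the Kinesis connector jar to actually run.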
Great, thanks!
Kind regards,
Levan Huyen
On Fri, 21 Oct 2022 at 00:53, Biao Geng wrote:
> You are right.
> It contains the python package `pyflink` and some dependencies like py4j
> and cloudpickle but does not contain all relevant dependencies(e.g.
> `google.protobuf` as the err
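A quick way to see this for yourself, assuming the 1.15.2 binary tarball layout (paths are from that layout, not from the thread):

```shell
# The binary distribution bundles pyflink plus a few deps as zips under opt/python:
ls flink-1.15.2/opt/python/
# pyflink.zip  py4j-*-src.zip  cloudpickle-*-src.zip

# Transitive third-party deps (e.g. protobuf) are not bundled, so for a full
# client-side environment install the PyPI package instead:
python3 -m pip install apache-flink==1.15.2
```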
Thanks Biao.
May I ask one more question: does the binary package on the Apache site (e.g.
https://archive.apache.org/dist/flink/flink-1.15.2) contain the Python
package `pyflink` and its dependencies? I guess the answer is no.
Thanks and regards,
Levan Huyen
On Thu, 20 Oct 2022 at 18:13, Biao Geng
install another version using
`pip install`. It is also confusing where to add the dependency jars.
Thanks and regards,
Levan Huyen
On Thu, 20 Oct 2022 at 02:25, Biao Geng wrote:
> Hi Levan,
>
> For your setup1 & 2, it looks like the python environment is not ready.
>
Hi,
I'm new to PyFlink, and I couldn't run a basic example that shipped with
Flink.
This is the command I tried:
./bin/flink run -py examples/python/datastream/word_count.py
Below are the results I got with different setups:
1. On AWS EMR 6.8.0 (Flink 1.15.1):
Error: No module named 'goog
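A setup that is commonly suggested for this symptom (the missing module is a third-party dependency of pyflink that the cluster's Python lacks) is to install `apache-flink` into an interpreter and point Flink at it; the version and paths below are assumptions, not taken from the thread:

```shell
# Install the PyFlink package (pulls in py4j, cloudpickle, protobuf, ...):
python3 -m pip install apache-flink==1.15.1   # match the cluster's Flink version

# Tell Flink to use that interpreter on both the client and the workers:
./bin/flink run \
    -pyclientexec "$(which python3)" \
    -pyexec "$(which python3)" \
    -py examples/python/datastream/word_count.py
```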