Upgrading pyflink to 1.16.0

2023-01-09 Thread Ramana
.0? Thanks for your help - Ramana.

Re: Unable to run pyflink job - NetUtils getAvailablePort Error

2022-09-06 Thread Ramana
Hi Xingbo, I have double-checked this; both the Flink and PyFlink versions that I have are 1.14.4 on the JobManager and TaskManager. However, I still get this error. Thanks, Ramana On Tue, Sep 6, 2022, 14:23 Xingbo Huang wrote: > Hi Ramana, > > This problem comes from the inconsistenc

Unable to run pyflink job - NetUtils getAvailablePort Error

2022-09-06 Thread Ramana
resolved? Appreciate any help here. Thanks. Ramana -- DREAM IT, DO IT

Re: Pyflink :: Conversion from DataStream to TableAPI

2022-08-16 Thread Ramana
se-1.15/docs/dev/python/datastream/operators/windows/ > > Hope this helps. > > Best, > Yuan > > On Tue, 16 Aug 2022 at 3:11 PM, Ramana wrote: > >> Hi All - >> >> Trying to achieve the following - >> >> 1. Ingest the data from RMQ >> 2. Dec

Pyflink :: Conversion from DataStream to TableAPI

2022-08-16 Thread Ramana
Hi All - Trying to achieve the following - 1. Ingest the data from RMQ 2. Decompress the data read from RMQ 3. Window it for 5 mins and process the data 4. Sink the processed data. Was able to achieve steps 1 and 2, however realized that PyFlink *DataStream* doesn't have window support. Given
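The resolution suggested in this thread is to convert the stream to the Table API, which did have window support at the time. The bucketing that a 5-minute tumbling window performs can be sketched in plain Python; the function names and millisecond timestamps below are assumptions for illustration only, not Flink API:

```python
# Assign an event timestamp (in ms) to its 5-minute tumbling window.
# This mirrors the window-assignment arithmetic a Tumble window in the
# Table API computes internally.
WINDOW_SIZE_MS = 5 * 60 * 1000  # 5 minutes

def window_start(ts_ms: int) -> int:
    """Inclusive start of the tumbling window containing ts_ms."""
    return ts_ms - (ts_ms % WINDOW_SIZE_MS)

def window_end(ts_ms: int) -> int:
    """Exclusive end of the tumbling window containing ts_ms."""
    return window_start(ts_ms) + WINDOW_SIZE_MS
```

In actual PyFlink code the conversion would go through `StreamTableEnvironment.from_data_stream` and a Table API `Tumble` window; the sketch only illustrates the window semantics being asked about.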

Re: Decompressing RMQ streaming messages

2022-07-22 Thread Ramana
(buffer, 0, len); > } while (len > 0); > decompressor.end(); > bos.close(); > return bos.toByteArray(); > } > } > > hope that helps. > > On Thu, 21 Jul 2022 at 21:13, Ramana wrote: > >> Hi - We have a requirement to read the compres

Decompressing RMQ streaming messages

2022-07-21 Thread Ramana
Hi - We have a requirement to read the compressed messages emitted by RabbitMQ and to have them processed using PyFlink. However, I am not finding any out-of-the-box functionality in PyFlink which can help decompress the messages. Could anybody help me with an example of how to go about this?
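Since PyFlink has no built-in decompression, the usual route is a plain `zlib` helper applied via `map()` on the RMQ source stream. A minimal sketch, assuming zlib- or gzip-framed payloads carrying UTF-8 text; the function name is hypothetical:

```python
import zlib

def decompress_message(raw: bytes) -> str:
    """Decompress a zlib- or gzip-framed payload to a UTF-8 string.

    wbits=32+MAX_WBITS tells zlib to auto-detect zlib vs. gzip headers,
    so the same helper handles both framings.
    """
    return zlib.decompress(raw, 32 + zlib.MAX_WBITS).decode("utf-8")
```

In a PyFlink job this would typically be wrapped in a `map` transformation right after the RabbitMQ source, before any windowing or processing.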

Re: Failed to deserialize Avro record

2020-06-09 Thread Ramana Uppala
a record that cannot be > > deserialized and debug the AvroRowDeserializationSchema with it. > > > > Best, > > > > Dawid > > > > On 06/06/2020 16:27, Ramana Uppala wrote: > > > We are using AvroRowDeserializationSchema with Kafka Table source

Re: [External Sender] Re: Flink sql nested elements

2020-06-09 Thread Ramana Uppala
After fixing the SQL to have the correct case, the query worked as expected. Is Flink SQL case-sensitive? We don't see any documentation related to this. It would be great if we could convert all query elements to lower case, similar to Hive. On 2020/06/09 07:58:20, Dawid Wysakowicz wr
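For reference, Flink SQL identifiers are indeed case-sensitive (unlike Hive, which lower-cases them), so nested fields must be written exactly as they appear in the Avro schema; back-quoting makes this explicit. A small illustration with hypothetical field and table names:

```python
# Hypothetical Avro record field `address` with nested field `addressLine1`.
# Flink SQL identifier case must match the schema exactly; backticks quote
# identifiers without changing their case.
wrong_query = "SELECT address.addressline1 FROM customers"      # wrong case: validation error
right_query = "SELECT `address`.`addressLine1` FROM customers"  # matches schema case
```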

Failed to deserialize Avro record

2020-06-06 Thread Ramana Uppala
We are using AvroRowDeserializationSchema with the Kafka Table source to deserialize the messages. The application failed with "Failed to deserialize Avro record." for certain messages, it seems. Caused by: org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -26 Caused by: java.
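One frequent cause of "Malformed data. Length is negative" is a payload that is not plain Avro binary, e.g. records produced with Confluent's schema-registry serializer, which prepends a 5-byte header that a plain Avro decoder then misreads as record data. The framing can be sketched in plain Python (the payload below is hypothetical; this is an illustration, not Flink code):

```python
import struct

# Confluent wire format: magic byte 0x00, then a 4-byte big-endian schema id,
# then the Avro-encoded record. A decoder expecting bare Avro interprets the
# header bytes as field lengths, often producing negative-length errors.
def split_confluent_frame(payload: bytes) -> tuple[int, bytes]:
    """Return (schema_id, avro_body) for a Confluent-framed payload."""
    if payload[0] != 0:
        raise ValueError("not Confluent wire format")
    schema_id = struct.unpack(">I", payload[1:5])[0]
    return schema_id, payload[5:]
```

If the producer uses this framing, the header must be stripped (or a registry-aware deserializer used) before AvroRowDeserializationSchema sees the bytes.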

Re: [External Sender] Re: Flink sql nested elements

2020-06-05 Thread Ramana Uppala
483647), `addressLine3` VARCHAR(2147483647)>> Based on the stack trace, the sqlUpdate API validates the SQL statement and throws the above error. Do we need any Calcite configuration to support nested types? Thanks, Ramana. On Fri, Jun 5, 2020 at 12:49 AM Leonard Xu wrote: > Hi

Re: [External Sender] Re: Avro Array type validation error

2020-06-05 Thread Ramana Uppala
aving issues. Also, one more observation: if we use legacy types in TableSource creation, the application does not work with the Blink planner. We get the same error, physical type not matching. Looking forward to the 1.11 changes. On Fri, Jun 5, 2020 at 3:34 AM Dawid Wysakowicz wrote: > H

Flink sql nested elements

2020-06-04 Thread Ramana Uppala
We have an Avro schema that contains a nested structure and when querying using Flink SQL, we get the below error. Exception in thread "main" java.lang.AssertionError at org.apache.calcite.sql.parser.SqlParserPos.sum_(SqlParserPos.java:236) at org.apache.calcite.sql.parser.SqlPars

Creating TableSchema from the Avro Schema

2020-06-04 Thread Ramana Uppala
Hi, In Flink 1.9, we have the option to create the TableSchema from TypeInformation. We have used the below. TypeInformation typeInfo = AvroSchemaConverter.convertToTypeInfo(schema); TableSchema tableSchema = TableSchema.fromTypeInfo(typeInfo); However, TableSchema's fromTypeInfo method is deprecated

Avro Array type validation error

2020-06-04 Thread Ramana Uppala
Hi, The Avro schema contains an Array type and we created a TableSchema out of the AvroSchema and created a table in the catalog. In the catalog, this specific field type is shown as ARRAY. We are using AvroRowDeserializationSchema with the connector and the returnType of TableSource shows Array mapped to LEGACY