.0?
Thanks for your help
- Ramana.
Hi Xingbo,
I have double-checked this; both the Flink and PyFlink versions I have are
1.14.4 on the JobManager and the TaskManager.
However, I still get this error.
Thanks
Ramana
On Tue, Sep 6, 2022, 14:23 Xingbo Huang wrote:
> Hi Ramana,
>
> This problem comes from the inconsistency
resolved?
Appreciate any help here.
Thanks.
Ramana
--
DREAM IT, DO IT
se-1.15/docs/dev/python/datastream/operators/windows/
>
> Hope this helps.
>
> Best,
> Yuan
>
> On Tue, 16 Aug 2022 at 3:11 PM, Ramana wrote:
>
>> Hi All -
>>
>> Trying to achieve the following -
>>
>> 1. Ingest the data from RMQ
>> 2. Dec
Hi All -
Trying to achieve the following -
1. Ingest the data from RMQ
2. Decompress the data read from RMQ
3. Window it for 5 mins and process the data
4. Sink the processed data.
Was able to achieve steps 1 and 2; however, I realized that PyFlink *DataStream*
doesn't have window support. Given
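For step 3 of the pipeline above, the 5-minute grouping can at least be prototyped outside Flink while the DataStream window support question is open. A minimal, framework-free sketch of tumbling-window assignment by event time; the 300-second width and the record shape `(timestamp, payload)` are illustrative assumptions, not a PyFlink API:

```python
from collections import defaultdict

WINDOW_SIZE_S = 300  # 5-minute tumbling windows (illustrative)

def window_start(ts_s: int, size_s: int = WINDOW_SIZE_S) -> int:
    """Align a timestamp (in seconds) to the start of its tumbling window."""
    return ts_s - (ts_s % size_s)

def bucket_by_window(records):
    """Group (timestamp, payload) pairs by the window they fall into."""
    windows = defaultdict(list)
    for ts, payload in records:
        windows[window_start(ts)].append(payload)
    return dict(windows)

# Records at t=10s and t=200s share the [0, 300) window; t=310s opens a new one.
buckets = bucket_by_window([(10, "a"), (200, "b"), (310, "c")])
```

The same bucketing is what a Flink tumbling-window assigner does internally, so the logic ports directly once a windowed API is available.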
>     private static byte[] decompress(byte[] data) throws IOException, DataFormatException {
>         Inflater decompressor = new Inflater();
>         decompressor.setInput(data);
>         ByteArrayOutputStream bos = new ByteArrayOutputStream(data.length);
>         byte[] buffer = new byte[1024];
>         int len;
>         do {
>             len = decompressor.inflate(buffer);
>             bos.write(buffer, 0, len);
>         } while (len > 0);
>         decompressor.end();
>         bos.close();
>         return bos.toByteArray();
>     }
> }
>
> hope that helps.
>
> On Thu, 21 Jul 2022 at 21:13, Ramana wrote:
>
>> Hi - We have a requirement to read the compres
Hi - We have a requirement to read the compressed messages emitted from
RabbitMQ and to have them processed using PyFlink. However, I am not
finding any out-of-the-box functionality in PyFlink that can help
decompress the messages.
Could anybody help me with an example of how to go about this?
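For plain zlib/deflate payloads, Python's standard `zlib` module can do the decompression inside the job itself; no Flink-specific functionality is needed. A minimal sketch, assuming zlib compression (for gzip-framed data, `gzip.decompress` would apply instead), written as the kind of function one would hand to a map step right after the RMQ source:

```python
import zlib

def decompress_message(raw: bytes) -> str:
    """Decompress a zlib-compressed message body into a UTF-8 string.

    In a PyFlink job this would be the body of a map function applied
    directly to the bytes read from RabbitMQ.
    """
    return zlib.decompress(raw).decode("utf-8")

# Round-trip check: compress a sample payload, then decompress it back.
payload = zlib.compress("hello from RMQ".encode("utf-8"))
original = decompress_message(payload)  # → "hello from RMQ"
```

If the producer uses a different codec (gzip, snappy, lz4), the corresponding decompression call is the only line that changes.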
a record that cannot be
> > deserialized and check debug the AvroRowDeserializationSchema with it.
> >
> > Best,
> >
> > Dawid
> >
> > On 06/06/2020 16:27, Ramana Uppala wrote:
> > > We are using AvroRowDeserializationSchema with Kafka Table source
After fixing the SQL to use the correct case, the query worked as expected.
Is Flink SQL case-sensitive? We don't see any documentation related to
this.
It would be great if we could convert all query elements to lower case, similar to
Hive.
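Flink SQL identifiers are indeed case-sensitive (Calcite resolves unquoted identifiers as written), whereas Hive lower-cases them. As a crude illustration of the Hive-style normalization asked about here, one could lower-case everything outside single-quoted string literals before submitting the statement. This is a hypothetical pre-processing helper, not anything Flink provides, and it does not handle back-quoted identifiers or comments:

```python
import re

def lowercase_identifiers(sql: str) -> str:
    """Lower-case a SQL statement, leaving single-quoted literals intact."""
    # Split on string literals ('' escapes a quote inside a literal),
    # keeping the literals as separate parts via the capture group.
    parts = re.split(r"('(?:[^']|'')*')", sql)
    return "".join(p if p.startswith("'") else p.lower()
                   for p in parts if p != "")

normalized = lowercase_identifiers("SELECT Name FROM Users WHERE City = 'NY'")
# → "select name from users where city = 'NY'"
```

Note that this would only paper over the mismatch: it works only if the catalog's own column names are all lower case to begin with.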
On 2020/06/09 07:58:20, Dawid Wysakowicz wr
We are using AvroRowDeserializationSchema with the Kafka table source to
deserialize the messages. The application failed with "Failed to deserialize Avro
record." for different messages, it seems.
Caused by: org.apache.avro.AvroRuntimeException: Malformed data. Length is
negative: -26
Caused by: java.
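A "Length is negative" failure usually means the decoder read bytes that are not valid Avro for the expected schema: Avro encodes string/bytes lengths as zigzag varints, so mismatched or corrupt input decodes to a nonsense length. A small sketch of zigzag decoding shows how a single byte 0x33 yields exactly -26, the value in the error above (illustrative of the mechanism, not a claim about the actual failing record):

```python
def zigzag_decode(n: int) -> int:
    """Decode Avro's zigzag encoding back to a signed integer.

    Zigzag maps 0, -1, 1, -2, 2, ... to 0, 1, 2, 3, 4, ... so that
    small negative numbers stay small when varint-encoded.
    """
    return (n >> 1) ^ -(n & 1)

# The varint byte 0x33 (decimal 51) decodes to -26: when the reader's
# schema does not match the bytes, an arbitrary byte gets interpreted
# as a length and negative values like this surface in the exception.
length = zigzag_decode(0x33)  # → -26
```

This is why the usual fix is to verify that the writer schema used by the producer matches the schema configured in the AvroRowDeserializationSchema.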
483647), `addressLine3` VARCHAR(2147483647)>>
Based on the stack trace, the sqlUpdate API validates the SQL statement and
throws the above error. Do we need to configure any Calcite
settings to support nested types?
Thanks,
Ramana.
On Fri, Jun 5, 2020 at 12:49 AM Leonard Xu wrote:
> Hi
aving issues.
Also, one more observation: if we use legacy types in TableSource
creation, the application does not work with the Blink planner. We get the
same error about the physical type not matching.
Looking forward to the 1.11 changes.
On Fri, Jun 5, 2020 at 3:34 AM Dawid Wysakowicz
wrote:
> H
We have an Avro schema that contains a nested structure, and when querying with
Flink SQL, we get the error below.
Exception in thread "main" java.lang.AssertionError
at
org.apache.calcite.sql.parser.SqlParserPos.sum_(SqlParserPos.java:236)
at org.apache.calcite.sql.parser.SqlPars
Hi,
In Flink 1.9, we have the option to create the TableSchema from TypeInformation. We
have used the following:
TypeInformation typeInfo = AvroSchemaConverter.convertToTypeInfo(schema);
TableSchema tableSchema = TableSchema.fromTypeInfo(typeInfo);
However, TableSchema's fromTypeInfo method is deprecated
Hi,
The Avro schema contains an Array type, and we created a TableSchema out of the
AvroSchema and created a table in the catalog. In the catalog, this specific field
type is shown as ARRAY. We are using
AvroRowDeserializationSchema with the connector, and the returnType of the TableSource
shows Array mapped to LEGACY