Re: Flink to BigTable

2021-01-25 Thread Pierre Oberholzer
HBase and > I remember someone telling me that Google has made a library available > which is effectively the HBase client which talks to BigTable in the > backend. > > Like I said: I haven't tried this yet myself. > > Niels Basjes > > On Sun, Jan 24, 2021 at 19:2

Flink to BigTable

2021-01-24 Thread Pierre Oberholzer
Dear Community, I would like to use BigTable as a sink for a Flink job: 1) Is there a connector out-of-the-box? 2) Can I use the DataStream API? 3) How can I optimally pass a sparse object (99% sparsity), i.e. ensure that no key/value pairs are created in BigTable for nulls? I have searched the documentation
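
A minimal DataStream sketch of the route suggested in the reply above: write through the plain HBase client API, so that Google's HBase-compatible Bigtable client can be swapped in via the classpath, and only write cells for keys that are actually present, which keeps nulls out of the store. The adapter setup, class names, and column-family layout below are assumptions, not details confirmed in the thread.

// Hypothetical sketch: a Flink sink writing sparse rows through the generic
// HBase client API. With the bigtable-hbase adapter on the classpath and an
// hbase-site.xml pointing at the Bigtable instance, the same code is expected
// to end up in Bigtable; this is untested, as noted in the reply.
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory, Put, Table}
import org.apache.hadoop.hbase.util.Bytes

// One record = a row key plus only the fields that are actually present.
case class SparseRow(key: String, cells: Map[String, String])

class SparseHBaseSink(tableName: String, columnFamily: String)
    extends RichSinkFunction[SparseRow] {

  @transient private var connection: Connection = _
  @transient private var table: Table = _

  override def open(parameters: Configuration): Unit = {
    // hbase-site.xml on the classpath decides whether this talks to HBase
    // or to Bigtable through the HBase-compatible client.
    connection = ConnectionFactory.createConnection(HBaseConfiguration.create())
    table = connection.getTable(TableName.valueOf(tableName))
  }

  override def invoke(row: SparseRow): Unit = {
    val put = new Put(Bytes.toBytes(row.key))
    // Only present key/values become cells, so nulls never reach the store.
    row.cells.foreach { case (qualifier, value) =>
      put.addColumn(Bytes.toBytes(columnFamily), Bytes.toBytes(qualifier), Bytes.toBytes(value))
    }
    if (!put.isEmpty) table.put(put)
  }

  override def close(): Unit = {
    if (table != null) table.close()
    if (connection != null) connection.close()
  }
}

In a job this would be attached with something like stream.addSink(new SparseHBaseSink("my_table", "cf")), where the column family "cf" is a placeholder.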

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-12-03 Thread Pierre Oberholzer
flink/blob/master/flink-python/pyflink/fn_execution/beam/beam_coder_impl_slow.py#L100 > [3] > https://github.com/apache/flink/blob/master/flink-python/pyflink/fn_execution/coder_impl_fast.pyx#L697 > > Best, > Xingbo > > Pierre Oberholzer wrote on Thu, Dec 3, 2020 at 3:08 PM: > >> H

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-12-02 Thread Pierre Oberholzer
[2] > https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/connectors/formats/avro.html#avro-format > > Best, > Xingbo > > Pierre Oberholzer wrote on Thu, Dec 3, 2020 at 2:57 AM: > >> Hi Xingbo, >> >> Nice! This looks a bit hacky, but shows that it can be

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-12-02 Thread Pierre Oberholzer
f3 TIMESTAMP(3) > ) WITH ( > 'connector' = 'print' > ) > """) > result_type = DataTypes.ROW( > [DataTypes.FIELD("f%s" % i, DataTypes.INT()) for i in > range(num_field)])

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-12-01 Thread Pierre Oberholzer
ring your entire JSON as a string field in > `Table` and putting the parsing work in the UDF? > > Best, > Xingbo > > Pierre Oberholzer wrote on Tue, Dec 1, 2020 at 4:13 AM: > >> Hi Xingbo, >> >> Many thanks for your follow-up. Yes, you got it right. >> So using Table API
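
A sketch of the suggestion above: keep the whole JSON document as a single STRING column and parse it inside the Scala UDF. Jackson as the parser and the MAP<STRING, STRING> type hint are assumptions, not taken from the thread.

import com.fasterxml.jackson.databind.ObjectMapper
import org.apache.flink.table.annotation.DataTypeHint
import org.apache.flink.table.functions.ScalarFunction

// Takes the raw JSON string column and returns it as a typed map,
// so the Table API never has to describe the sparse structure itself.
class ParseJsonMap extends ScalarFunction {
  @transient private lazy val mapper = new ObjectMapper()

  @DataTypeHint("MAP<STRING, STRING>")
  def eval(json: String): java.util.Map[String, String] = {
    if (json == null) new java.util.HashMap[String, String]()
    else mapper.readValue(json, classOf[java.util.HashMap[String, String]])
  }
}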

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-30 Thread Pierre Oberholzer
.11/dev/python/table-api-users-guide/python_types.html > > Best, > Xingbo > > On Nov 28, 2020, at 12:49 AM, Pierre Oberholzer wrote: > > Hello Wei, Dian, Xingbo, > > Not really sure when it is appropriate to knock on the door of the > community ;) > I just wanted to mention that

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-20 Thread Pierre Oberholzer
>> st_env.get_config().get_configuration().set_string("pipeline.jars", " >> file:///Users/zhongwei/the-dummy-udf.jar") >> >> # register the udf via >> st_env.execute_sql("CREATE FUNCTION dummyMap AS 'com.dummy.dummyMap' >> LANGU
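
For reference, a guess at what the Scala side of the-dummy-udf.jar could look like so that the CREATE FUNCTION statement above resolves; the excerpt does not show the actual signature, so the field names, values, and the MAP hint are assumptions.

package com.dummy

import org.apache.flink.table.annotation.DataTypeHint
import org.apache.flink.table.functions.ScalarFunction

// Built into the-dummy-udf.jar and registered from PyFlink with:
//   CREATE FUNCTION dummyMap AS 'com.dummy.dummyMap' LANGUAGE SCALA
class dummyMap extends ScalarFunction {
  @DataTypeHint("MAP<STRING, STRING>")
  def eval(): java.util.Map[String, String] = {
    val m = new java.util.HashMap[String, String]()
    m.put("s", "hello")
    m.put("t", "world")
    m
  }
}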

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-18 Thread Pierre Oberholzer
"com.dummy.dummyMap") > > # prepare source and sink > t = st_env.from_elements([(1, 'hi', 'hello'), (2, 'hi', 'hello')], ['a', > 'b', 'c']) > st_env.execute_sql("""create table mySink ( >

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-17 Thread Pierre Oberholzer
the old type system. In this situation you need to override the > 'getResultType' method instead of adding a type hint. > > You can also try to register your UDF via the "CREATE FUNCTION" SQL > statement, which accepts the type hint. > > Best, > Wei > > On 20
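
A sketch of the getResultType route mentioned above, for the legacy type system; the map field content is illustrative only.

import org.apache.flink.api.common.typeinfo.{TypeInformation, Types}
import org.apache.flink.table.functions.ScalarFunction

// Old type system: the result type is declared by overriding
// getResultType instead of using annotations or SQL type hints.
class dummyMap extends ScalarFunction {
  def eval(): java.util.Map[String, String] = {
    val m = new java.util.HashMap[String, String]()
    m.put("s", "hello")
    m.put("t", "world")
    m
  }

  override def getResultType(signature: Array[Class[_]]): TypeInformation[_] =
    Types.MAP(Types.STRING, Types.STRING)
}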

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-17 Thread Pierre Oberholzer
gards, On Tue, Nov 17, 2020 at 10:04, Wei Zhong wrote: > Hi Pierre, > > You can try to replace the '@DataTypeHint("ROW")' with > '@FunctionHint(output = new DataTypeHint("ROW"))' > > Best, > Wei > > On Nov 17, 2020, at 15:45, Pierre Oberhol
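
Spelled out, Wei's suggestion might look like the following; the row field names s and t are taken from the sink schema quoted in the next message, and the rest is an assumption.

import org.apache.flink.table.annotation.{DataTypeHint, FunctionHint}
import org.apache.flink.table.functions.ScalarFunction
import org.apache.flink.types.Row

// Declares the output as ROW<s STRING, t STRING> on the class,
// matching a TableSink schema of Row(s: String, t: String).
@FunctionHint(output = new DataTypeHint("ROW<s STRING, t STRING>"))
class dummyMap extends ScalarFunction {
  def eval(): Row = Row.of("hello", "world")
}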

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-16 Thread Pierre Oberholzer
match. Query result schema: [output_of_my_scala_udf: GenericType] TableSink schema: [output_of_my_scala_udf: Row(s: String, t: String)] On Fri, Nov 13, 2020 at 11:59, Pierre Oberholzer wrote: > Thanks Dian, but same error when using explicit returned type: > > class dummyMap() ex

Re: PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-13 Thread Pierre Oberholzer
fer to the corresponding documentation. > > [1] > https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/functions/udfs.html#implementation-guide > > On Nov 13, 2020, at 4:56 PM, Pierre Oberholzer wrote: > > ScalarFunction > > > -- Pierre Oberholzer

PyFlink - Scala UDF - How to convert Scala Map in Table API?

2020-11-13 Thread Pierre Oberholzer
Hi, I'm trying to use the Map[String,String] output of a Scala UDF (scala.collection.immutable.Map) as a valid data type in the Table API, namely via the Java type (java.util.Map) as recommended here. Howeve
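
A minimal sketch of that java.util.Map route, assuming the Scala map is built inside the UDF; the JavaConverters conversion and the MAP type hint are assumptions rather than details from the linked recommendation.

import scala.collection.JavaConverters._

import org.apache.flink.table.annotation.DataTypeHint
import org.apache.flink.table.functions.ScalarFunction

class MapUdf extends ScalarFunction {
  @DataTypeHint("MAP<STRING, STRING>")
  def eval(k: String, v: String): java.util.Map[String, String] = {
    // Build the scala.collection.immutable.Map as usual...
    val scalaMap: Map[String, String] = Map(k -> v)
    // ...and hand a java.util.Map to the Table API, which it can type properly.
    scalaMap.asJava
  }
}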