Yes, metrics won't work for this use case.
Before we had proper metrics support, accumulators were often used as a
workaround.

In general, the Table API & SQL in Flink are designed to keep all data in
tables and not "leak" data on the side.
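
As a rough sketch of what that accumulator workaround looked like (this is illustrative only and not from the thread; it mirrors the shape of Flink's `org.apache.flink.api.common.accumulators.Accumulator` contract in plain Java, whereas a real job would register Flink's own `ListAccumulator` via `getRuntimeContext().addAccumulator(...)` in a `RichFunction`'s `open()` method):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal, self-contained sketch of a list-style accumulator.
// Each parallel task instance collects values locally; the partial
// results are merged into one list when the job finishes.
public class StringListAccumulator {
    private final List<String> local = new ArrayList<>();

    // Called on a parallel instance whenever a "special" string is seen.
    public void add(String value) {
        local.add(value);
    }

    // The per-instance partial result.
    public List<String> getLocalValue() {
        return local;
    }

    // Merges the partial result of another instance into this one
    // (in Flink, the JobManager performs this merge at job completion).
    public void merge(StringListAccumulator other) {
        local.addAll(other.getLocalValue());
    }

    public static void main(String[] args) {
        // Simulate two parallel instances collecting special strings.
        StringListAccumulator a = new StringListAccumulator();
        StringListAccumulator b = new StringListAccumulator();
        a.add("special-1");
        b.add("special-2");
        a.merge(b);
        System.out.println(a.getLocalValue()); // [special-1, special-2]
    }
}
```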

Best, Fabian

2018-04-04 22:08 GMT+02:00 Darshan Singh <darshan.m...@gmail.com>:

> I doubt the metrics will work, as I will need to get the String output. I
> will need to figure out something else.
>
> Basically, I will pass my function a string and will get some columns back,
> but if my string is something special, then I will need to get extra
> information. I am using the DataSet API, so I think I won't be able to use
> side outputs.
>
> As of now I return my POJO along with the String (only if it is special)
> and then I select these from the table itself. I was just wondering if there
> is a better alternative where my existing table returns my POJO and I
> somehow find those special strings once the job is done. The number of these
> strings is very small. I thought I could create a custom accumulator which
> would work more like a list, and in the end I would get all the values.
>
> I guess I will need to keep trying different approaches.
>
> Thanks
>
> On Wed, Apr 4, 2018 at 9:01 PM, Darshan Singh <darshan.m...@gmail.com>
> wrote:
>
>> Thanks for getting back to me. I was trying to see various options to
>> accumulate (a sort of list append) some of my data. The amount of appended
>> data is quite small, so I wanted to see if I can use accumulators and, once
>> the job is done, get all the data and then use it the way I want.
>>
>> I guess I will try the metrics as well to see how that goes.
>>
>> Thanks
>>
>> On Wed, Apr 4, 2018 at 8:51 PM, Fabian Hueske <fhue...@gmail.com> wrote:
>>
>>> Hi Darshan,
>>>
>>> Accumulators are not exposed to UDFs of the Table API / SQL.
>>> What's your use case for these? Would metrics do the job as well?
>>>
>>> Best, Fabian
>>>
>>> 2018-04-04 21:31 GMT+02:00 Darshan Singh <darshan.m...@gmail.com>:
>>>
>>>> Hi,
>>>>
>>>> I would like to use accumulators with table/scalar functions. However,
>>>> I am not able to figure out how to get the runtime context from inside a
>>>> scalar function's open() method.
>>>> The only thing I see is the FunctionContext, which cannot provide the
>>>> runtime context.
>>>>
>>>> I tried using AbstractRichFunction.getRuntimeContext()
>>>> <https://ci.apache.org/projects/flink/flink-docs-release-1.3/api/java/org/apache/flink/api/common/functions/AbstractRichFunction.html#getRuntimeContext-->,
>>>> which I guess is not allowed from the open() of scalar or table functions.
>>>>
>>>> Any help will be appreciated.
>>>>
>>>> Thanks
>>>>
>>>
>>>
>>
>
