Thank you very much for coming back and reporting the good news! :-)
If you think that there is something that we can do to improve Flink's ORC
input format, for example log a warning, please open a Jira.
Thank you,
Fabian
On Wed., Sep. 25, 2019 at 05:14, 163 wrote:
Hi Fabian,
After debugging in local mode, I found that the Flink ORC connector is not the problem: some fields in our schema are in upper-case form, so these fields cannot be matched.
But a program that reads the ORC file directly using the includeColumns method matches columns with equalsIgnoreCase,
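The mismatch described above can be sketched as follows. This is a minimal illustration, not Flink's actual code: the column names ("ID", "TAGINDUSTRY") are hypothetical, while tagIndustry is the field named in the original report and equalsIgnoreCase is the standard java.lang.String method:

```java
import java.util.Arrays;
import java.util.List;

public class ColumnMatchDemo {
    // Column names as declared in the ORC file's schema (upper-case, as in our data).
    static final List<String> ORC_COLUMNS = Arrays.asList("ID", "TAGINDUSTRY");

    // Strict, case-sensitive lookup: the behavior that leaves the field unmatched.
    static boolean matchesExact(String requested) {
        return ORC_COLUMNS.stream().anyMatch(c -> c.equals(requested));
    }

    // Relaxed, case-insensitive lookup: the behavior of an equalsIgnoreCase match.
    static boolean matchesIgnoreCase(String requested) {
        return ORC_COLUMNS.stream().anyMatch(c -> c.equalsIgnoreCase(requested));
    }

    public static void main(String[] args) {
        // A mixed-case field name from the table schema fails the strict match
        // (so the field comes back null), but passes the relaxed one.
        System.out.println(matchesExact("tagIndustry"));      // false
        System.out.println(matchesIgnoreCase("tagIndustry")); // true
    }
}
```

With a strict equals() match, an unmatched column is simply never projected, which explains why the job succeeds but the field is silently null rather than failing with an error.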
Hi QiShu,
It might be that Flink's OrcInputFormat has a bug.
Can you open a Jira issue to report the problem?
In order to be able to fix this, we need as much information as possible.
It would be great if you could create a minimal example of an ORC file and
a program that reproduces the issue.
If
Hi Guys,
The Flink version is 1.9.0. I use OrcTableSource to read an ORC file in HDFS, and the job executes successfully, without any exception or error. But some fields (such as tagIndustry) are always null, although these fields are actually not null. I can read these fields by reading the file directly. Below is m