[ https://issues.apache.org/jira/browse/HIVE-19201?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16640386#comment-16640386 ]
Nishant Bangarwa commented on HIVE-19201:
-----------------------------------------

[~Korge] Make sure the column names in Druid are all lowercase. If a Druid column contains uppercase characters and Hive sends the query with lowercase column names, the results can come back as NULL.

> Hive doesn't read Druid data correctly
> --------------------------------------
>
>                 Key: HIVE-19201
>                 URL: https://issues.apache.org/jira/browse/HIVE-19201
>             Project: Hive
>          Issue Type: Bug
>          Components: Druid integration, Hive
>    Affects Versions: 2.3.0
>         Environment: Ubuntu 16.04 LTS Desktop
> Druid 0.12.0 (standalone - quickstart conf)
> Hive 2.3.0 (standalone - quickstart conf)
> Also have Hadoop and Zookeeper running
>            Reporter: Tournadre
>            Priority: Blocker
>         Attachments: DruidIntegration.tar
>
>
> I created an external table in Hive pointing at my Druid datasource, wikiticker (the broker address is already defined). However, a few columns appear as NULL and I don't know why.
> I also repeatedly have to delete *.lck files in the metadata DB for Hive and Druid (otherwise queries don't work).
> The DESCRIBE statement lists the columns and their types correctly, but I only get NULL when I query certain columns (string type). I checked in Druid; the data is still there.
>
> Help! :(
> PS: Sorry, first issue and JIRA; didn't see my issue resolved yet.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
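For reference, a minimal sketch of the suggested fix, assuming the Druid quickstart "wikiticker" datasource and that the broker address is already configured via hive.druid.broker.address.default; the table and column names are illustrative, not taken from the reporter's attachment:

```sql
-- Sketch (assumptions: datasource "wikiticker" exists and
-- hive.druid.broker.address.default points at the Druid broker).
-- Map the existing Druid datasource; Hive infers the schema from Druid.
CREATE EXTERNAL TABLE druid_wikiticker
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "wikiticker");

-- Hive lowercases identifiers before sending the query to Druid, so this
-- only matches a Druid column actually named "channel"; a Druid column
-- named "Channel" would come back as NULL.
SELECT `channel`, COUNT(*)
FROM druid_wikiticker
GROUP BY `channel`;
```

If the Druid ingestion spec used mixed-case dimension names, re-ingesting with all-lowercase dimensions (rather than renaming on the Hive side) is the workaround the comment describes.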