Hi Flink Community,

We are trying to upgrade our Flink SQL job from 1.10 to 1.11. We use a Kafka 
source table, and the data is stored in Kafka in Avro format.
The schema looks like this:


{
  "type": "record",
  "name": "event",
  "namespace": "busseniss.event",
  "fields": [
    {
      "name": "header",
      "type": {
        "type": "record",
        "name": "header",
        "fields": [
          {
            "name": "createTimestamp",
            "type": "long"
          },
          {
            "name": "sentTimestamp",
            "type": "long"
          },
          {
            "name": "eventId",
            "type": [
              "null",
              {
                "type": "string",
                "avro.java.string": "String"
              }
            ]
          }
        ]
      },
      "doc": "Rheos header "
    },
    {
      "name": "si",
      "type": [
        "null",
        "string"
      ]
    }
  ]
}
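
For reference, the Flink 1.11 DDL we would expect for this schema looks roughly
like the sketch below, using the new connector option keys introduced in 1.11.
The table name, topic, broker address, and group id are placeholders, not the
values from our real job; the nested "header" record maps to a ROW type and the
nullable fields map to nullable STRING/BIGINT columns:

CREATE TABLE event_source (
  -- nested Avro record "header" mapped to a ROW type
  header ROW<
    createTimestamp BIGINT,
    sentTimestamp BIGINT,
    eventId STRING
  >,
  -- nullable union ["null","string"] maps to a nullable STRING
  si STRING
) WITH (
  'connector' = 'kafka',                           -- new 1.11 connector factory
  'topic' = 'event-topic',                         -- placeholder topic name
  'properties.bootstrap.servers' = 'broker:9092',  -- placeholder brokers
  'properties.group.id' = 'flink-consumer',        -- placeholder group id
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'avro'
);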
--

Hongjian Peng
Department of Computer Science and Engineering
Shanghai Jiao Tong University
Email: super...@163.com
