Hi guys, I am trying to use Confluent Platform 3.3.0 with the S3 connector, and I am getting a StackOverflowError:
java.lang.StackOverflowError
    at java.util.HashMap.hash(HashMap.java:338)
    at java.util.LinkedHashMap.get(LinkedHashMap.java:440)
    at org.apache.avro.JsonProperties.getJsonProp(JsonProperties.java:141)
    at org.apache.avro.JsonProperties.getProp(JsonProperties.java:130)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1258)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1239)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1348)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1381)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1359)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1239)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1348)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1381)
    at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1359)

The schema structure (Avro IDL) is:

@namespace("com.test.avro")
protocol TestError {
  record TestError {
    union { null, string } type = null;
    union { null, array<TestError> } errors = null;
  }
}

I wrote a basic unit test for AvroData.toConnectSchema(...), and it fails with the same error:

@Test
public void testToConnectDateAvroCustomEanError() {
  org.apache.avro.Schema avroSchema = new org.apache.avro.Schema.Parser().parse("{\"type\":\"record\",\"name\":\"TestError\",\"namespace\":\"com.test.avro\",\"fields\":[{\"name\":\"type\",\"type\":[\"null\",{\"type\":\"string\",\"avro.java.string\":\"String\"}],\"default\":null},{\"name\":\"errors\",\"type\":[\"null\",{\"type\":\"array\",\"items\":\"TestError\"}],\"default\":null}]}");
  avroData.toConnectData(avroSchema, 10000);
}

Is this a known issue, or am I doing something wrong?

Thanks,
Catalin
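My guess, from the repeating toConnectSchema frames in the trace, is that the self-reference (TestError containing array<TestError>) sends the converter into unbounded recursion. Here is a minimal, library-free sketch of that failure mode; the names (Node, naiveDepth, safeDepth) are hypothetical and this is not Confluent's actual code:

```java
import java.util.*;

public class RecursiveSchemaDemo {
    // Toy stand-in for an Avro record schema: a named node whose
    // fields may refer back to the node itself.
    static class Node {
        final String name;
        final List<Node> fields = new ArrayList<>();
        Node(String name) { this.name = name; }
    }

    // Naive traversal with no cycle check: a self-referential schema
    // recurses until the stack is exhausted (StackOverflowError).
    static int naiveDepth(Node n) {
        int d = 1;
        for (Node f : n.fields) d = Math.max(d, 1 + naiveDepth(f));
        return d;
    }

    // Cycle-aware traversal: stop when a node already being visited
    // is reached again, so self-references terminate.
    static int safeDepth(Node n, Set<Node> visiting) {
        if (!visiting.add(n)) return 0; // already on the path: cut off
        int d = 1;
        for (Node f : n.fields) d = Math.max(d, 1 + safeDepth(f, visiting));
        return d;
    }

    public static void main(String[] args) {
        Node testError = new Node("TestError");
        testError.fields.add(testError); // errors: array<TestError> -> self

        boolean overflowed = false;
        try {
            naiveDepth(testError);
        } catch (StackOverflowError e) {
            overflowed = true;
        }
        System.out.println("naive overflowed: " + overflowed);
        System.out.println("safe depth: " + safeDepth(testError, new HashSet<>()));
    }
}
```

If AvroData resolves the recursive "items": "TestError" reference eagerly like naiveDepth does, that would explain the trace above; a fix would presumably need the kind of visited-schema tracking shown in safeDepth.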