[
https://issues.apache.org/jira/browse/AVRO-1905?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Busbey updated AVRO-1905:
------------------------------
Description:
My understanding is that Avro is both backward and forward compatible
(for certain schema changes), but in my tests it is neither backward nor
forward compatible.
Maven project with Avro. The schema:
{code}
{"namespace": "example.avro",
 "type": "record",
 "name": "user",
 "fields": [
   {"name": "name", "type": "string"},
   {"name": "favorite_number", "type": "int"}
 ]
}
{code}
Producer
{code}
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Arrays;

import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;
import org.apache.avro.specific.SpecificRecord;

public class Producer {
  public static void main(String[] args) throws IOException {
    try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
      user u1 = user.newBuilder().setFavoriteNumber(1).setName("Amod").build();
      writeBinaryEncodedAvro(u1, outputStream);
      user u2 = user.newBuilder().setFavoriteNumber(2).setName("Pandey").build();
      writeBinaryEncodedAvro(u2, outputStream);
      System.out.println(Arrays.toString(outputStream.toByteArray()));
    }
  }

  static void writeBinaryEncodedAvro(SpecificRecord specificRecord, OutputStream os)
      throws IOException {
    BinaryEncoder binaryEncoder = EncoderFactory.get().binaryEncoder(os, null);
    @SuppressWarnings("unchecked")
    DatumWriter<SpecificRecord> datumWriter =
        new SpecificDatumWriter<SpecificRecord>(
            (Class<SpecificRecord>) specificRecord.getClass());
    datumWriter.write(specificRecord, binaryEncoder);
    binaryEncoder.flush();
  }
}
{code}
Consumer
{code}
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;

public class Consumer {
  public static void main(String[] args) throws IOException {
    // Bytes written with the new schema: name "Amod", favorite_number 1,
    // favorite_color "green".
    byte[] data = {8, 65, 109, 111, 100, 2, 10, 103, 114, 101, 101, 110};
    try (ByteArrayInputStream inputStream = new ByteArrayInputStream(data)) {
      System.out.println(fromBinaryMulti(inputStream));
    }
  }

  static user fromBinary(InputStream is) throws IOException {
    BinaryDecoder binaryDecoder = DecoderFactory.get().binaryDecoder(is, null);
    DatumReader<user> datumReader = new SpecificDatumReader<user>(user.class);
    return datumReader.read(null, binaryDecoder);
  }

  static List<user> fromBinaryMulti(InputStream is) throws IOException {
    List<user> users = new ArrayList<user>();
    BinaryDecoder binaryDecoder = DecoderFactory.get().binaryDecoder(is, null);
    while (!binaryDecoder.isEnd()) {
      DatumReader<user> datumReader = new SpecificDatumReader<user>(user.class);
      users.add(datumReader.read(null, binaryDecoder));
    }
    return users;
  }
}
{code}
I changed the schema to
{code}
{"namespace": "example.avro",
 "type": "record",
 "name": "user",
 "fields": [
   {"name": "name", "type": "string"},
   {"name": "favorite_number", "type": "int"},
   {"name": "favorite_color", "type": "string", "default": "green"}
 ]
}
{code}
Neither direction works:
Consumer code generated from the new schema cannot read a byte array
produced with the old schema.
Consumer code generated from the old schema cannot read a byte array
produced with the new schema.
Is there a problem in the way I understand forward and backward
compatibility?
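For what it's worth, Avro's schema resolution needs both the writer's and the reader's schema at decode time; a {{SpecificDatumReader}} built from {{user.class}} alone assumes the data was written with that exact schema. A minimal sketch of reading old-schema bytes with the new generated {{user}} class, using the two-schema constructor (the {{OLD_SCHEMA_JSON}} constant and the class name {{ResolvingConsumer}} are illustrative, not from the reporter's project):
{code}
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;

public class ResolvingConsumer {
  // Writer's (old) schema, without favorite_color.
  static final String OLD_SCHEMA_JSON =
      "{\"namespace\": \"example.avro\", \"type\": \"record\", \"name\": \"user\","
      + " \"fields\": [{\"name\": \"name\", \"type\": \"string\"},"
      + " {\"name\": \"favorite_number\", \"type\": \"int\"}]}";

  public static void main(String[] args) throws IOException {
    // name "Amod", favorite_number 1 -- encoded with the old schema.
    byte[] oldBytes = {8, 65, 109, 111, 100, 2};
    Schema writerSchema = new Schema.Parser().parse(OLD_SCHEMA_JSON);
    Schema readerSchema = user.getClassSchema(); // schema of the new generated class
    DatumReader<user> datumReader =
        new SpecificDatumReader<user>(writerSchema, readerSchema);
    BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(oldBytes, null);
    // favorite_color should be populated from its default, "green".
    System.out.println(datumReader.read(null, decoder));
  }
}
{code}
With a plain {{new SpecificDatumReader<user>(user.class)}} the writer and reader schemas are taken to be identical, so the decoder misreads bytes once the schemas diverge; supplying the writer's schema is what makes the added-field-with-default case resolve.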
> Backward and forward compatible
> -------------------------------
>
> Key: AVRO-1905
> URL: https://issues.apache.org/jira/browse/AVRO-1905
> Project: Avro
> Issue Type: Bug
> Components: java
> Reporter: Amod Kumar Pandey
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)