I'm trying to write Beam Rows directly to BigQuery, since that should
involve fewer conversions and be more efficient, but I'm hitting a strange
error.
If I set a nullable array field to null, the write throws:

Caused by: java.lang.IllegalArgumentException: Received null value for non-nullable field
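For context, here is roughly what I'm doing (a simplified sketch; the
project, table, and field names are made up):

Schema schema =
    Schema.builder()
        .addStringField("id")
        .addNullableField("tags", Schema.FieldType.array(Schema.FieldType.STRING))
        .build();

// Row whose nullable array field is left as null.
Row row = Row.withSchema(schema).addValues("a", null).build();

PCollection<Row> rows =
    pipeline.apply(Create.of(row).withRowSchema(schema));

rows.apply(
    BigQueryIO.<Row>write()
        .to("my-project:my_dataset.my_table")
        .useBeamSchema()
        .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API));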

Here is the Beam code I found that looks related:

https://github.com/apache/beam/blob/111f4c34ab2efd166de732c32d99ff615abf6064/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BeamRowToStorageApiProto.java#L277

  private static Object messageValueFromRowValue(
      FieldDescriptor fieldDescriptor, Field beamField, int index, Row row) {
    @Nullable Object value = row.getValue(index);
    if (value == null) {
      if (fieldDescriptor.isOptional()) {
        return null;
      } else {
        throw new IllegalArgumentException(
            "Received null value for non-nullable field " +
fieldDescriptor.getName());
      }
    }
    return toProtoValue(fieldDescriptor, beamField.getType(), value);
  }

At line 277, why not use beamField.isNullable() instead of
fieldDescriptor.isOptional()? If it's using the Beam schema, it should
stick to the nullable setting on the Beam schema field, correct?
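
In other words, I would have expected something along these lines (just a
sketch to illustrate the question, not a tested patch), deciding based on
the Beam field's nullability:

    @Nullable Object value = row.getValue(index);
    if (value == null) {
      // Sketch: consult the Beam schema's nullable flag instead of the
      // proto descriptor's optionality.
      if (beamField.getType().getNullable()) {
        return null;
      } else {
        throw new IllegalArgumentException(
            "Received null value for non-nullable field " + beamField.getName());
      }
    }
    return toProtoValue(fieldDescriptor, beamField.getType(), value);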

And how do I avoid this?
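
The only workaround I can think of so far is to replace null arrays with
empty lists before handing the rows to BigQueryIO, something like this (a
sketch; "tags" is again a made-up nullable array field name):

// Swap null arrays for empty lists so the repeated proto field never sees null.
PCollection<Row> fixed =
    rows.apply(
            MapElements.into(TypeDescriptor.of(Row.class))
                .via(
                    (Row r) ->
                        r.getArray("tags") == null
                            ? Row.fromRow(r)
                                  .withFieldValue("tags", Collections.emptyList())
                                  .build()
                            : r))
        .setRowSchema(rows.getSchema());

But that adds an extra per-row pass, which is what I was hoping to avoid in
the first place.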

Regards,
Siyuan
