Arrow Build Report for Job nightly-2020-02-08-0
All tasks:
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2020-02-08-0
Failed Tasks:
- conda-win-vs2015-py36:
URL:
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2020-02-08-0-azure-conda-win-vs2015-py36
- co
bb created ARROW-7805:
Summary: Apache Arrow HDFS Remove (rm) operation defaults to
SkipTrash
Key: ARROW-7805
URL: https://issues.apache.org/jira/browse/ARROW-7805
Project: Apache Arrow
Issue Type: Improvement
I'm asking because it doesn't seem to validate that the schema of the
array being written is equivalent to the original schema.
It also appears to be used only in unit tests.
Thanks,
Micah
I'd like to understand if anyone is making use of the following features,
and whether we should revisit them before 1.0.
1. Dictionaries can encode null values.
   - This becomes error-prone for things like Parquet. We seem to be
calculating the definition level solely based on the null bitmap.
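The concern in point 1 can be sketched with a small, self-contained model
(plain Python for illustration only, not Arrow's actual data structures or
its Parquet writer): if definition levels are derived solely from the
validity bitmap, a null that lives in the dictionary itself is silently
treated as a present value.

```python
# Hypothetical minimal model of a dictionary-encoded array.
dictionary = ["a", None]          # the dictionary itself encodes a null
indices = [0, 1, 0]
validity = [True, True, True]     # every slot is valid per the null bitmap

# Definition levels computed solely from the null bitmap, as the
# thread suggests is happening, mark every value as present:
levels_from_bitmap = [1 if v else 0 for v in validity]
print(levels_from_bitmap)         # [1, 1, 1]

# But decoding through the dictionary reveals a logical null, so
# correct definition levels would have to consult the dictionary too:
decoded = [dictionary[i] if v else None for i, v in zip(indices, validity)]
levels_correct = [0 if x is None else 1 for x in decoded]
print(decoded)                    # ['a', None, 'a']
print(levels_correct)             # [1, 0, 1]
```

The mismatch between the two level computations is exactly the failure
mode described: the bitmap-only path never sees the null.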
I might h