Architectural question
Hi all, I have an architectural question. For my app I have 50 GB of persistent data stored in HDFS as simple CSV files. The app also takes 10 GB of input data, likewise in CSV format, and should run over the 50 GB persistent data. The persistent data and the input data have no common keys. In
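(The message is cut off in the digest, so the actual question is unknown. Purely as a hedged illustration of the setup described, here is a minimal sketch in Java using Spark SQL; the HDFS paths, the header option, and the use of crossJoin are all assumptions, not anything stated by the poster. Since the two datasets share no keys, combining them row-against-row implies a Cartesian product, which crossJoin expresses but which can be very expensive at 50 GB x 10 GB.)

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class CsvOverHdfs {
        public static void main(String[] args) {
            // Minimal sketch only; paths are hypothetical placeholders.
            SparkSession spark = SparkSession.builder()
                    .appName("csv-over-hdfs")
                    .getOrCreate();

            // 50 GB persistent dataset and 10 GB input dataset, both plain CSV on HDFS.
            Dataset<Row> persistent = spark.read().option("header", "true")
                    .csv("hdfs:///data/persistent");
            Dataset<Row> input = spark.read().option("header", "true")
                    .csv("hdfs:///data/input");

            // No common keys: combining the two is a Cartesian product.
            // crossJoin is one way to express that (assumption; the poster's
            // intended processing may be entirely different).
            Dataset<Row> combined = persistent.crossJoin(input);
            combined.show(10);

            spark.stop();
        }
    }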
[jira] [Created] (HADOOP-13063) Incorrect error message while setting dfs.block.size to wrong value
Oleksiy Sayankin created HADOOP-13063:
-----------------------------------------

             Summary: Incorrect error message while setting dfs.block.size to wrong value
                 Key: HADOOP-13063
                 URL: https://issues.apache.org/jira/browse/HADOOP-13063
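(For context: dfs.block.size is the deprecated alias of dfs.blocksize in Hadoop 2.x; per the summary, the issue is that an invalid value produces a misleading error message. Below is a hedged sketch of setting the property programmatically; the class name and the 128 MB value are illustrative only.)

    import org.apache.hadoop.conf.Configuration;

    public class BlockSizeExample {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // dfs.block.size is deprecated; current Hadoop resolves it to dfs.blocksize.
            conf.setLong("dfs.blocksize", 128L * 1024 * 1024); // 128 MB
            System.out.println("dfs.blocksize = " + conf.get("dfs.blocksize"));
        }
    }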