Dear all,

Today we tried to load a partitioned Parquet file as instructed in <
https://spark.apache.org/docs/1.3.1/sql-programming-guide.html#partition-discovery>
:

```
sqlContext.parquetFile("hdfs:///bwlogs/beta/archive/EC.Buy/_year=2015/_month=06/_day=11")
```

but we got:

```
java.lang.IllegalArgumentException: Could not find Parquet metadata at path
hdfs://bwhdfscluster/bwlogs/beta/archive/EC.Buy/_year=2015/_month=06/_day=11
```

However, if I create a new HiveContext myself:

```
val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc.parquetFile("hdfs:///bwlogs/beta/archive/EC.Buy/_year=2015/_month=06/_day=11")
```

It works.
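For context, here is what we expected partition discovery to let us do, per the guide: point at the base directory and have the partition columns inferred. This is only a sketch against our layout (the `sc` setup and paths are from our environment; whether `_`-prefixed partition directory names are supported is exactly what we are unsure about):

```scala
// Sketch, Spark 1.3.1: read the base directory and let partition
// discovery infer _year / _month / _day as columns of the DataFrame.
val hc = new org.apache.spark.sql.hive.HiveContext(sc)
val df = hc.parquetFile("hdfs:///bwlogs/beta/archive/EC.Buy")

// The inferred partition columns should then be usable in filters,
// pruning down to the same day we tried to load directly above.
df.filter("_year = 2015 AND _month = 6 AND _day = 11").count()
```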

Is this a bug? Or did I make a mistake in configuring my HDFS cluster?

Thanks,
Wush
