D'oh! Of course it is: add -Dhadoop-20S.version=1.3.0-SNAPSHOT so it picks my Hadoop version of choice (also -Phadoop-1 now).
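For reference, the full invocation looks roughly like this (a sketch from my setup; the profile name and the 1.3.0-SNAPSHOT version assume you have a locally built hadoop-core installed in your Maven repository, so adjust both to your environment):

```shell
# Run the tests against a specific Hadoop 1.x line instead of whatever
# hadoop-core version the build would otherwise resolve from the Maven cache.
# -Phadoop-1 selects the Hadoop 1.x build profile;
# -Dhadoop-20S.version overrides the version of the 0.20S (secure 1.x) shim deps.
mvn test -Phadoop-1 -Dhadoop-20S.version=1.3.0-SNAPSHOT
```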
Thanks,
~Remus

-----Original Message-----
From: Remus Rusanu [mailto:[email protected]]
Sent: Thursday, November 28, 2013 4:44 PM
To: [email protected]
Subject: What determines the shims used during testing?

On my Windows system tests fail because Context.getScratchDir fails:

java.io.IOException: Failed to set permissions of path: \HW\project\hive-monarch\itests\qtest\target\tmp\scratchdir\hive_2013-11-28_16-20-53_995_395869757529987755-1 to 0700
	at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
	at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:664)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:290)
	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:290)
	at org.apache.hadoop.fs.ProxyFileSystem.setPermission(ProxyFileSystem.java:290)
	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:290)
	at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:212)
	at org.apache.hadoop.hive.ql.Context.getExternalScratchDir(Context.java:272)
	at org.apache.hadoop.hive.ql.Context.getExternalTmpFileURI(Context.java:365)
	....

This seems related to the platform-specific chmod, which on Windows relies on winutils.exe. But I don't understand how Hive testing decides which Hadoop shim to use. ShimLoader.getMajorVersion uses VersionInfo.getVersion(), and in the debugger I can see this returns "1.2.1"; accordingly the -core-1.2.1 JARs from my Maven cache are used. This seems to ignore my current HADOOP_HOME and any CLASSPATH I try to set. The loaded JARs do not work properly on Windows, and that causes my problems.

How can I control, when I run mvn test, which shims are being used?

Thanks,
~Remus
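To make the mechanism concrete, the shim selection can be sketched roughly like this. This is a simplified sketch, not Hive's actual ShimLoader source: the method name majorVersionToShim and the "0.23" branch for Hadoop 2.x are my assumptions; what I observed in the debugger is only that the decision is driven by VersionInfo.getVersion() from whichever hadoop-core JAR Maven put on the test classpath (here "1.2.1"), not by HADOOP_HOME or CLASSPATH.

```java
// Simplified sketch of shim selection by Hadoop version string.
// The "0.20S" name corresponds to the -Dhadoop-20S.version Maven property;
// the "0.23" mapping for 2.x is an assumption, not taken from Hive's code.
public class ShimVersionSketch {

    // Map a Hadoop version string (e.g. "1.2.1") to a shim family name.
    static String majorVersionToShim(String version) {
        String major = version.split("\\.")[0];
        switch (major) {
            case "1":
                return "0.20S"; // secure 0.20 / Hadoop 1.x shims
            case "2":
                return "0.23";  // Hadoop 0.23 / 2.x shims (assumed mapping)
            default:
                throw new IllegalArgumentException(
                        "Unrecognized Hadoop version: " + version);
        }
    }

    public static void main(String[] args) {
        // VersionInfo.getVersion() returned "1.2.1" in my debugger session,
        // so the 0.20S shims (and the -core-1.2.1 JARs) get loaded.
        System.out.println(majorVersionToShim("1.2.1")); // prints 0.20S
    }
}
```

The practical consequence is that overriding the version happens at the Maven dependency level (hence -Dhadoop-20S.version), because that controls which hadoop-core JAR, and therefore which VersionInfo, ends up on the test classpath.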
