Github user GraceH commented on the pull request:
https://github.com/apache/spark/pull/2892#issuecomment-60399662
Thanks @srowen. This is actually an old bug that exists in older Hadoop
versions, but I am not quite sure whether there is any concern about keeping
1.0.4 as the default Hadoop version now. There are two possible ways to fix
the RPC exception hit when writing the event log to HDFS:
1. Upgrade the default Hadoop version to 1.1.x or above, where the issue no
longer occurs.
2. Work around it by constructing the permission with the `new FsPermission()`
API instead of `FsPermission.createImmutable` (see the sketch below). Both
APIs are acceptable here, although `FsPermission.createImmutable` is much
more readable.
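
For reference, a minimal sketch of the second option. The log directory path
and the 0770 mode here are placeholders, not the exact change in this PR:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.fs.permission.FsPermission

object EventLogPermissionSketch {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    val fs = FileSystem.get(conf)
    // Hypothetical event log directory; in Spark this would come from
    // spark.eventLog.dir.
    val logDir = new Path("/tmp/spark-events")

    // Option 2: build the permission with the plain FsPermission
    // constructor rather than FsPermission.createImmutable, so the
    // mutable instance serializes over the setPermission RPC on
    // Hadoop 1.0.x, which is where the reported exception comes from.
    val perm = new FsPermission(Integer.parseInt("770", 8).toShort)
    fs.setPermission(logDir, perm)
  }
}
```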
I will create a JIRA for further discussion. Please leave your comments
there. Thanks.