Hey Ankit, sorry for the trouble with the mailing lists. It's our fault that we didn't properly mark the issues@ list as read-only on our website. I've updated the website to make this clearer: http://flink.apache.org/community.html#mailing-lists
Regarding the problem you've reported: are you using Flink with YARN, or do you have it installed directly on the cluster? Either way, Flink doesn't have support for secure Hadoop environments right now. I'll make adding security support for Flink on YARN a top priority.

Best,
Robert

On Tue, Jan 27, 2015 at 6:43 AM, Henry Saputra <henry.sapu...@gmail.com> wrote:
> Ankit,
>
> Please subscribe to the dev@ list by sending an email to
> dev-subscr...@flink.apache.org [1]
>
> [1] http://flink.apache.org/community.html#mailing-lists
>
> On Mon, Jan 26, 2015 at 9:25 PM, Kostas Tzoumas <ktzou...@apache.org> wrote:
> > I am forwarding this as I could not approve it for some reason.
> >
> > Kostas
> >
> > ---------- Forwarded message ----------
> > From: Ankit Jhalaria <anki...@yahoo-inc.com.invalid>
> > To: "dev@flink.apache.org" <dev@flink.apache.org>
> > Date: Mon, 26 Jan 2015 22:59:53 +0000 (UTC)
> > Subject: [Flink reading HDFS]: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
> >
> > Hey guys,
> >
> > I am trying to set up a Flink cluster that can read from a Hadoop cluster.
> > We have a total of 8 machines in the Flink cluster, and all of those
> > machines can access HDFS on the Hadoop cluster.
> >
> > Flink version: flink-0.9-SNAPSHOT
> > Hadoop version: 2.5.0.8.1411070359
> >
> > When I try to run a program that reads from HDFS, I get the following
> > error [stack trace shown below]. How do I enable Flink to use Kerberos
> > authentication? Any clues would be appreciated. I tried sending emails
> > earlier to issues@flink.apache.org and got a bounce-back email every time.
> >
> > Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
> > SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
> >     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> >     at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:601)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >     at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
> >     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
> >     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
> >
> > Thanks,
> > Ankit
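
For anyone hitting the same "SIMPLE authentication is not enabled" error: a Kerberos-secured HDFS normally requires the client process to log in through Hadoop's UserGroupInformation before it issues any RPCs, which is the kind of step Flink would have to perform once secure Hadoop support is added. Below is a minimal sketch of that login against a plain Hadoop client; the class name, principal (flink@EXAMPLE.COM) and keytab path (/etc/security/keytabs/flink.keytab) are placeholders for illustration and are not taken from this thread.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberosHdfsCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder principal and keytab -- adjust to your environment.
            String principal = "flink@EXAMPLE.COM";
            String keytab = "/etc/security/keytabs/flink.keytab";

            // Picks up core-site.xml / hdfs-site.xml from the classpath;
            // hadoop.security.authentication must be set to "kerberos" there.
            Configuration conf = new Configuration();
            UserGroupInformation.setConfiguration(conf);

            // Obtain Kerberos credentials for this process from the keytab.
            UserGroupInformation.loginUserFromKeytab(principal, keytab);

            // If the login succeeded, a simple metadata call like this should
            // no longer fail with "SIMPLE authentication is not enabled".
            FileSystem fs = FileSystem.get(conf);
            System.out.println(fs.getFileStatus(new Path("/")).isDirectory());
        }
    }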