I have not been able to figure this out and have been stuck on it for the
past week. Any help would be appreciated. Here are the client logs:


2018-06-16 19:25:10,523 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 8

2018-06-16 19:25:10,578 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 1

2018-06-16 19:25:10,588 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 1 -> 8

2018-06-16 19:25:10,591 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 5

2018-06-16 19:25:10,597 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 5 -> 8

2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration                - error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
    at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
    at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
    at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
    at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-16 19:25:10,620 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null

2018-06-16 19:25:10,621 DEBUG org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error details

java.lang.ExceptionInInitializerError
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
    at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.RuntimeException: javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2600)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
    at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
    at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
    ... 14 more
Caused by: javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
    at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)
    ... 19 more

2018-06-16 19:25:10,627 DEBUG org.apache.hadoop.service.AbstractService                - Service: org.apache.hadoop.yarn.client.api.impl.YarnClientImpl entered state STOPPED

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                - stopping client from cache: org.apache.hadoop.ipc.Client@32c726ee

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                - removing client from cache: org.apache.hadoop.ipc.Client@32c726ee

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                - stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@32c726ee

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                - Stopping client

2018-06-16 19:25:10,629 DEBUG org.apache.hadoop.service.AbstractService                - Service: org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl entered state STOPPED

2018-06-16 19:25:10,630 ERROR org.apache.flink.client.cli.CliFrontend                - Fatal error while running command line interface.

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.Records
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:212)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)
    at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
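
In case it helps narrow things down: the "Feature 'http://apache.org/xml/features/xinclude' is not recognized" message usually points to an old Xerces parser being picked up on the client classpath (for example one shaded into test.jar or pulled in via HADOOP_CLASSPATH) instead of the JDK's built-in parser, since old Xerces versions do not know the XInclude feature that Hadoop's Configuration enables. A quick check I could run (only a sketch, assuming test.jar is in the current directory and the job is built with Maven):

jar tf test.jar | grep -i "org/apache/xerces"
mvn dependency:tree -Dincludes=xerces:xercesImpl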


On Fri, Jun 15, 2018 at 9:35 PM Till Rohrmann <trohrm...@apache.org> wrote:

> Hmm could you maybe share the client logs with us.
>
> Cheers,
> Till
>
> On Fri, Jun 15, 2018 at 4:54 PM Garvit Sharma <garvit...@gmail.com> wrote:
>
>> Yes, I did.
>>
>> On Fri, Jun 15, 2018 at 6:17 PM Till Rohrmann <trohrm...@apache.org>
>> wrote:
>>
>>> Hi Garvit,
>>>
>>> have you exported the HADOOP_CLASSPATH as described in the release notes
>>> [1]?
>>>
>>> [1]
>>> https://ci.apache.org/projects/flink/flink-docs-release-1.5/release-notes/flink-1.5.html#hadoop-classpath-discovery
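>>>
>>> For example, assuming the hadoop command line tools are available on the
>>> client machine (just a sketch; the exact value depends on your Hadoop
>>> distribution), the release notes boil down to something like:
>>>
>>> export HADOOP_CLASSPATH=`hadoop classpath`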
>>>
>>> Cheers,
>>> Till
>>>
>>> On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma <garvit...@gmail.com>
>>> wrote:
>>>
>>>> Does anyone have any idea how to get rid of the above parse exception
>>>> while submitting a Flink job to YARN?
>>>>
>>>> I have already searched the internet but could not find a solution.
>>>>
>>>> Please help.
>>>>
>>>> On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma <garvit...@gmail.com>
>>>> wrote:
>>>>
>>>>> Thanks Chesnay. Now it is connecting to the ResourceManager, but I am
>>>>> getting the exception below:
>>>>>
>>>>> 2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration
>>>>>                       - error parsing conf core-default.xml
>>>>>
>>>>> javax.xml.parsers.ParserConfigurationException: Feature '
>>>>> http://apache.org/xml/features/xinclude' is not recognized.
>>>>>
>>>>> at
>>>>> org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
>>>>> Source)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)
>>>>>
>>>>> at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
>>>>>
>>>>> at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)
>>>>>
>>>>> at
>>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
>>>>>
>>>>> at
>>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
>>>>>
>>>>> at
>>>>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>>>>
>>>>> at
>>>>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>>>>
>>>>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>>>
>>>>> at
>>>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>>>
>>>>> at
>>>>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>>>
>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>
>>>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>>>
>>>>> at
>>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>>
>>>>> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>>>>
>>>>> 2018-06-15 09:12:44,825 WARN  
>>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor
>>>>>           - Error while getting queue information from YARN: null
>>>>>
>>>>> java.lang.NoClassDefFoundError: Could not initialize class
>>>>> org.apache.hadoop.yarn.util.Records
>>>>>
>>>>> at
>>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)
>>>>>
>>>>> at
>>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)
>>>>>
>>>>> at
>>>>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>>>>
>>>>> at
>>>>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>>>>
>>>>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>>>
>>>>> at
>>>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>>>
>>>>> at
>>>>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>>>
>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>
>>>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>>
>>>>> at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>>>
>>>>> at
>>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>>
>>>>> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>>>>
>>>>> Please help.
>>>>>
>>>>> Thanks,
>>>>>
>>>>>
>>>>> On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler <ches...@apache.org>
>>>>> wrote:
>>>>>
>>>>>> My gut feeling is that these classes must be present in jars in the
>>>>>> /lib directory. I don't think you can supply these with the submitted 
>>>>>> jar.
>>>>>> For a simple test, put your jar into the /lib folder before
>>>>>> submitting it.
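>>>>>>
>>>>>> A minimal sketch of that test (the Flink installation path is only an
>>>>>> example; adjust it to wherever your flink-1.5.0 distribution lives):
>>>>>>
>>>>>> cp test.jar /opt/flink-1.5.0/lib/
>>>>>> # then re-run the same bin/flink run ... yarn-cluster command as before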
>>>>>>
>>>>>> On 14.06.2018 06:56, Garvit Sharma wrote:
>>>>>>
>>>>>> Can someone please tell me why I am facing this?
>>>>>>
>>>>>> On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma <garvit...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I am using *flink-1.5.0-bin-hadoop27-scala_2.11* to submit jobs
>>>>>>> through YARN, but I am getting the exception below:
>>>>>>>
>>>>>>> java.lang.NoClassDefFoundError:
>>>>>>> com/sun/jersey/core/util/FeaturesAndProperties
>>>>>>>
>>>>>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>>
>>>>>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>>>>>>
>>>>>>> at
>>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>>>>
>>>>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>>>>>>
>>>>>>> at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>>>>>>
>>>>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>>>>>>
>>>>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>>>>>>
>>>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>
>>>>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>>>>>>
>>>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>>>>
>>>>>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>>>>>>>
>>>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)
>>>>>>>
>>>>>>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>>>>>
>>>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>
>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>>>>>>
>>>>>>> Caused by: java.lang.ClassNotFoundException:
>>>>>>> com.sun.jersey.core.util.FeaturesAndProperties
>>>>>>>
>>>>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>>>>
>>>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>>>>
>>>>>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>>>>>>>
>>>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>>>>
>>>>>>>
>>>>>>> Command: HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m
>>>>>>> yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test -yqu
>>>>>>> default -p 20 test.jar
>>>>>>>
>>>>>>> The class *com/sun/jersey/core/util/FeaturesAndProperties* is
>>>>>>> already present in test.jar, so I am not sure why I am getting this
>>>>>>> exception.
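>>>>>>>
>>>>>>> For reference, a quick way to confirm the class is inside the fat jar
>>>>>>> (just a sketch, run from the directory that holds test.jar):
>>>>>>>
>>>>>>> jar tf test.jar | grep FeaturesAndProperties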
>>>>>>>
>>>>>>> Please check and let me know.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> --
>>>>>>>
>>>>>>> Garvit Sharma
>>>>>>> github.com/garvitlnmiit/
>>>>>>>
>>>>>>> No Body is a Scholar by birth, its only hard work and strong
>>>>>>> determination that makes him master.
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>>
>>>>>> Garvit Sharma
>>>>>> github.com/garvitlnmiit/
>>>>>>
>>>>>> No Body is a Scholar by birth, its only hard work and strong
>>>>>> determination that makes him master.
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>> --
>>>>>
>>>>> Garvit Sharma
>>>>> github.com/garvitlnmiit/
>>>>>
>>>>> No Body is a Scholar by birth, its only hard work and strong
>>>>> determination that makes him master.
>>>>>
>>>>
>>>>
>>>> --
>>>>
>>>> Garvit Sharma
>>>> github.com/garvitlnmiit/
>>>>
>>>> No Body is a Scholar by birth, its only hard work and strong
>>>> determination that makes him master.
>>>>
>>>
>>
>> --
>>
>> Garvit Sharma
>> github.com/garvitlnmiit/
>>
>> No Body is a Scholar by birth, its only hard work and strong
>> determination that makes him master.
>>
>

-- 

Garvit Sharma
github.com/garvitlnmiit/

No Body is a Scholar by birth, its only hard work and strong determination
that makes him master.
