fly-cutter 16:00
After packaging a few times just now: the hadoop version of this shade module 
may also need to be declared inside the shade package itself, and the version 
variable cannot be swapped for a different one, otherwise the shaded jar ends 
up containing two client versions of hadoop. Not yet clear why that happens.



fly-cutter 16:05
It's probably related to how pom declarations are inherited: once the parent 
pom has changed, a child module that wants to override the same dependency 
from the parent pom needs to use the same version variable.
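
[Editor's note] A minimal sketch of the inheritance behavior described above; the `hadoop.version` property and the version numbers are illustrative assumptions, not the project's actual pom:

```xml
<!-- parent pom: dependency version driven by a property -->
<properties>
  <hadoop.version>2.7.2</hadoop.version>
</properties>
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- child (shade) module: to override the parent's dependency it must
     redefine the SAME property rather than hard-code a different version,
     otherwise two versions of the client can end up in the shaded jar -->
<properties>
  <hadoop.version>3.3.1</hadoop.version>
</properties>
```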


peacewong@WDS 16:20
That's a problem




fly-cutter 16:22
I will try to solve this. Maybe we need to take a step back: shade 
hadoop-hdfs, but not hadoop-client. I'll be driving all day today, so I'll 
look into it later.




peacewong@WDS

done

Xu Ling 16:33

If only hadoop-hdfs is shaded, this problem can be solved with a variable




Spark2 should only be using the client functionality, and Hadoop 2.8 doesn't have that package




fly-cutter
OK, I'll try it later




I've tried it: shading only hdfs, with a variable controlling the artifact, 
solves the problem of the shade version not being passable in from outside
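
[Editor's note] As a sketch, the shade setup described here might look like the following; the `hdfs.artifact` property name is a hypothetical stand-in for whatever the project actually uses:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <includes>
        <!-- shade only hadoop-hdfs; the artifact is selected through a
             property so it can be controlled from outside the module -->
        <include>org.apache.hadoop:${hdfs.artifact}</include>
      </includes>
    </artifactSet>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```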




peacewong@WDS
Hi fly-cutter, impressive




fly-cutter
Not at all, it was only with everyone's guidance that I found a direction. 
The code may still have room for optimization; I'll take another look later



peacewong@WDS
Cool, I learned something

