FWIW, although the Guava dependency Hadoop declares is pretty low, everything in 
org.apache.hadoop is set up to run against later versions. It just sticks with 
the old one to avoid breaking anything downstream which does expect a low 
version number. See HADOOP-10101 for the ongoing pain there, and complain on 
that ticket if you do find something in the Hadoop layer which can't handle 
later Guava versions.




On 16 Dec 2016, at 11:07, Sean Owen <so...@cloudera.com> wrote:

Yes, that's the problem. Guava isn't generally mutually compatible across more 
than a couple of major releases. You may have to hunt for a version that happens 
to have the functionality both dependencies want, and hope that such a version 
exists. Spark should shade Guava at this point, but that doesn't mean you won't 
hit this problem from transitive dependencies.
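One way to act on the shading idea on the application side is to relocate Guava inside your own uber jar, so your copy can never collide with the one on Spark's classpath. A minimal sketch using the Gradle Shadow plugin (the plugin version shown is illustrative, not from the thread):

```groovy
// build.gradle -- relocate Guava into a private package inside the fat jar.
// Plugin version is an assumption; use whatever release matches your Gradle.
plugins {
    id 'java'
    id 'com.github.johnrengelman.shadow' version '1.2.4'
}

shadowJar {
    // Rewrites com.google.common.* references in your jar (and in bundled
    // nsq-client classes) to a package Spark's Guava 14 can't shadow.
    relocate 'com.google.common', 'myapp.shaded.com.google.common'
}
```

With this, spark-submit can keep using Spark's own Guava while nsq-client gets the relocated 19.0 classes it was compiled against.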

On Fri, Dec 16, 2016 at 11:05 AM kant kodali <kanth...@gmail.com> wrote:
I replaced guava-14.0.1.jar with guava-19.0.jar in SPARK_HOME/jars and it seems 
to work OK, but I am not sure it is the right thing to do. My fear is that if 
Spark uses features from Guava that are present in 14.0.1 but not in 19.0, my 
app will break.
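A less invasive alternative to editing SPARK_HOME/jars is to ask Spark to prefer your application's classes at runtime. A sketch, assuming Spark 2.x and an illustrative jar name (the two `userClassPathFirst` settings are real Spark 2.x configuration keys, but they are documented as experimental and can surface other conflicts):

```shell
# Prefer the Guava bundled in the uber jar over Spark's own copy,
# on both the driver and the executors. Jar/class names are illustrative.
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.hello.streamprocessing.app.SparkDriver2 \
  app-uber.jar
```

This leaves the Spark installation untouched, so nothing else running against it is affected.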



On Fri, Dec 16, 2016 at 2:22 AM, kant kodali <kanth...@gmail.com> wrote:
Hi Guys,

Here is a simplified version of my problem. I am new to Gradle, and I have the 
following dependency setup:



dependencies {
    compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.2'
    compile group: 'com.github.brainlag', name: 'nsq-client', version: '1.0.0.RC2'
}
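One common Gradle-side workaround for this kind of conflict is to force a single Guava version across the whole dependency graph. A sketch, with 19.0 chosen here only as an assumption; it works only if the forced version satisfies both Spark and nsq-client:

```groovy
// build.gradle -- pin every configuration to one Guava version so the
// fat jar contains exactly one copy. The version chosen is illustrative.
configurations.all {
    resolutionStrategy {
        force 'com.google.guava:guava:19.0'
    }
}
```

Note this only controls what lands in your jar; at runtime Spark's own guava-14.0.1 can still win on the classpath, which is why shading or `userClassPathFirst` is often needed as well.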

I took out the other dependencies for simplicity. The problem is that 
spark-core_2.11 uses com.google.guava:guava:14.0.1 while nsq-client uses 
com.google.guava:guava:19.0, so when I submit my fat uber jar using 
spark-submit I get the following error:


Exception in thread "main" java.lang.NoSuchMethodError: 
com.google.common.collect.Sets.newConcurrentHashSet()Ljava/util/Set;
    at com.github.brainlag.nsq.NSQProducer.(NSQProducer.java:22)
    at com.hello.streamprocessing.app.SparkDriver2.main(SparkDriver2.java:37)
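For context on what the failing call does: `Sets.newConcurrentHashSet()` was reportedly added in Guava 15.0, which would explain the `NoSuchMethodError` when Spark's guava-14.0.1 wins on the classpath. The JDK itself offers an equivalent since Java 8; a small demo (class name is illustrative):

```java
// A JDK-only equivalent of Guava's Sets.newConcurrentHashSet(), shown to
// illustrate what the missing method provides. Class name is illustrative.
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentSetDemo {
    public static void main(String[] args) {
        // ConcurrentHashMap.newKeySet() (JDK 8+) returns a thread-safe Set,
        // effectively what Guava's Sets.newConcurrentHashSet() returns.
        Set<String> ids = ConcurrentHashMap.newKeySet();
        ids.add("producer-1");
        System.out.println(ids.contains("producer-1")); // prints true
    }
}
```

This doesn't fix nsq-client, which calls the Guava method internally, but it shows the conflict is over one small utility rather than anything Spark-specific.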

Any help would be great. If you need more description, you can find it here: 
<http://stackoverflow.com/questions/41003416/how-to-solve-nomethoderror-that-arises-due-to-using-a-same-library-with-two-diff>

Thanks!


