java.lang.IllegalArgumentException: Cannot add dependency
'org.apache.spark#spark-core_2.10;1.5.1' to configuration 'tests' of module
cep_assembly#cep_assembly_2.10;1.0 because this configuration doesn't exist!
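As far as I can tell, sbt reads the trailing "tests" in that dependency as an ivy configuration name, and the module defines no such configuration; Spark publishes its test classes under the "tests" artifact classifier instead. A minimal sketch of how that dependency is usually written, assuming the test classes should only be on the test classpath:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" % "test" classifier "tests"

Even with that in place, SparkFunSuite lives in Spark's own test sources and, if I remember correctly, is package-private to org.apache.spark, so extending ScalaTest's FunSuite directly may be the simpler route for code in your own package.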

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

On 22 April 2016 at 16:37, Ted Yu <[email protected]> wrote:

> For SparkFunSuite, add the following:
>
> libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.5.1" % "tests"
>
> On Fri, Apr 22, 2016 at 7:20 AM, Mich Talebzadeh <[email protected]> wrote:
>
>> I am trying to build with sbt using the following dependencies:
>>
>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" %
>> "provided"
>> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"  %
>> "provided"
>> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1" %
>> "provided"
>> libraryDependencies += "junit" % "junit" % "4.12"
>> libraryDependencies += "org.scala-sbt" % "test-interface" % "1.0"
>> libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1"
>> % "provided"
>> libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" %
>> "1.6.1"
>> libraryDependencies += "org.scalactic" %% "scalactic" % "2.2.6"
>> libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.6"
>> libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.5.1"
>> libraryDependencies += "org.apache.spark" %
>> "spark-streaming-kafka-assembly_2.10" % "1.6.1"
>>
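>> One thing that stands out in the list above is that it mixes Spark 1.5.1 and 1.6.1 artifacts and pulls in spark-core twice (once via %% and once as spark-core_2.10), which can leave sbt resolving conflicting versions. A sketch of a version-consistent block, assuming everything is meant to stay on 1.5.1 for Scala 2.10 (only the Spark and test artifacts are shown for brevity):
>>
>>   val sparkVersion = "1.5.1"
>>   libraryDependencies ++= Seq(
>>     "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
>>     "org.apache.spark" %% "spark-sql"             % sparkVersion % "provided",
>>     "org.apache.spark" %% "spark-hive"            % sparkVersion % "provided",
>>     "org.apache.spark" %% "spark-streaming"       % sparkVersion % "provided",
>>     "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion,
>>     "org.scalatest"    %% "scalatest"             % "2.2.6"      % "test",
>>     "junit"            %  "junit"                 % "4.12"       % "test"
>>   )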
>>
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:35: object SparkFunSuite is not a member of package org.apache.spark
>> [error] import org.apache.spark.{SparkConf, SparkContext, SparkFunSuite}
>> [error]        ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:47: not found: type SparkFunSuite
>> [error]   extends SparkFunSuite
>> [error]           ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:88: package test is not a value
>> [error]   test("basic stream receiving with multiple topics and smallest starting offset") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:146: package test is not a value
>> [error]   test("receiving from largest starting offset") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:191: package test is not a value
>> [error]   test("creating stream by offset") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:237: package test is not a value
>> [error]   test("offset recovery") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:319: package test is not a value
>> [error]   test("Direct Kafka stream report input information") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:358: package test is not a value
>> [error]   test("maxMessagesPerPartition with backpressure disabled") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:367: package test is not a value
>> [error]   test("maxMessagesPerPartition with no lag") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:376: package test is not a value
>> [error]   test("maxMessagesPerPartition respects max rate") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:386: package test is not a value
>> [error]   test("using rate controller") {
>> [error]   ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:531: object WindowState is not a member of package org.apache.spark.streaming.dstream
>> [error] import org.apache.spark.streaming.dstream.WindowState
>> [error]        ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:578: not found: type WindowState
>> [error]     def rise(in: Tick, ew: WindowState): Boolean = {
>> [error]                            ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:582: not found: type WindowState
>> [error]     def drop(in: Tick, ew: WindowState): Boolean = {
>> [error]                            ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:586: not found: type WindowState
>> [error]     def deep(in: Tick, ew: WindowState): Boolean = {
>> [error]                            ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:592: not found: type WindowState
>> [error]     val predicateMapping: Map[String, (Tick, WindowState) => Boolean] =
>> [error]                                              ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:595: value patternMatchByKeyAndWindow is not a member of org.apache.spark.streaming.dstream.DStream[(String, Tick)]
>> [error]     val matches = kvTicks.patternMatchByKeyAndWindow("rise drop [rise ]+ deep".r,
>> [error]                           ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:646: not found: type WindowState
>> [error]     def rise(in: Tick, ew: WindowState): Boolean = {
>> [error]                            ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:650: not found: type WindowState
>> [error]     def drop(in: Tick, ew: WindowState): Boolean = {
>> [error]                            ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:654: not found: type WindowState
>> [error]     def deep(in: Tick, ew: WindowState): Boolean = {
>> [error]                            ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:660: not found: type WindowState
>> [error]     val predicateMapping: Map[String, (Tick, WindowState) => Boolean] =
>> [error]                                              ^
>> [error] /data6/hduser/scala/CEP_assembly/src/main/scala/myPackage/CEP_assemly.scala:663: value patternMatchByWindow is not a member of org.apache.spark.streaming.dstream.DStream[(Long, Tick)]
>> [error]     val matches = kvTicks.patternMatchByWindow("rise drop [rise ]+ deep".r,
>> [error]                           ^
>> [error] 22 errors found
>> [error] (compile:compileIncremental) Compilation failed
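>>
>> For what it's worth, the long run of "package test is not a value" errors looks like a knock-on effect of the failed SparkFunSuite import: once the class no longer extends a suite, the test(...) calls have nothing to resolve against. A minimal sketch of the same shape built directly on ScalaTest, which is already in the dependency list (the suite name here is hypothetical):
>>
>>   import org.apache.spark.{SparkConf, SparkContext}
>>   import org.scalatest.FunSuite
>>
>>   // Hypothetical stand-in for SparkFunSuite: a plain ScalaTest suite that
>>   // creates a local SparkContext inside the test and stops it afterwards.
>>   class CEPAssemblySuite extends FunSuite {
>>     test("basic stream receiving with multiple topics and smallest starting offset") {
>>       val conf = new SparkConf().setMaster("local[2]").setAppName("CEPAssemblySuite")
>>       val sc = new SparkContext(conf)
>>       try {
>>         // test body goes here
>>       } finally {
>>         sc.stop()
>>       }
>>     }
>>   }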
>>
>>
>> On 22 April 2016 at 14:53, Ted Yu <[email protected]> wrote:
>>
>>> Normally Logging would be available in a spark-shell session, since the
>>> spark-core jar is on the classpath by default:
>>>
>>> scala> import org.apache.spark.internal.Logging
>>> import org.apache.spark.internal.Logging
>>>
>>> See this JIRA:
>>>
>>> [SPARK-13928] Move org.apache.spark.Logging into org.apache.spark.internal.Logging
>>>
>>> In the 1.6.x releases, Logging was at org.apache.spark.Logging.
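>>>
>>> A small sketch of the version-dependent import, assuming the move landed in the 2.0 line:
>>>
>>>   // Spark 1.x (e.g. 1.5.1 or 1.6.1): Logging sits directly in the package
>>>   import org.apache.spark.Logging
>>>
>>>   // Spark 2.0+ (after SPARK-13928): Logging moved under the internal package
>>>   import org.apache.spark.internal.Logging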
>>>
>>> FYI
>>>
>>> On Fri, Apr 22, 2016 at 12:21 AM, Mich Talebzadeh <[email protected]> wrote:
>>>
>>>>
>>>> Hi,
>>>>
>>>> Does anyone know which jar file has import org.apache.spark.internal.Logging?
>>>>
>>>> I tried spark-core_2.10-1.5.1.jar, but it does not seem to work:
>>>>
>>>> scala> import org.apache.spark.internal.Logging
>>>>
>>>> <console>:57: error: object internal is not a member of package org.apache.spark
>>>>          import org.apache.spark.internal.Logging
>>>>
>>>> Thanks
>>>>
>>>
>>>
>>
>
