All of them should be "provided".
Pozdrawiam,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Sun, Aug 14, 2016 at 12:26 PM, Mich Talebzadeh wrote:
> LOL
>
> well the issue here was the dependencies scripted in that shell script
LOL
well, the issue here was the dependencies scripted in that shell
script, which was modified to add "provided" to them.
The script itself still works; just the content of one of the functions had to
be edited:
function create_sbt_file {
SBT_FILE=${GEN_APPSDIR}/scala/${APPLICATION}/${FILE_NAME}.sbt
[ -f
Hi Mich,
Yeah, you don't have to worry about it... and that's why you're asking
these questions ;-)
Pozdrawiam,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Sun, Aug 14,
The magic does all that (including compiling and submitting with the jar
file). It is flexible as it does all this for any Scala program. It creates
sub-directories, compiles, submits, etc., so I don't have to worry about it.
HTH
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?i
Hi,
You should have all the deps marked "provided", since they're provided
by the Spark infrastructure after you spark-submit the uber-jar for the app.
What's the "magic" in local.ksh? Why don't you sbt assembly and do
spark-submit with the uber-jar?
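For reference, a minimal sketch of that setup; the project name and the sbt-assembly version below are assumptions, not details taken from this thread:

// project/plugins.sbt (sbt-assembly coordinates assumed; check the plugin docs for the version)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt
name := "myapp"                      // hypothetical project name
version := "1.0"
scalaVersion := "2.11.7"

// every Spark dependency "provided": compiled against, but left out of the uber-jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.0" % "provided"

Running "sbt assembly" then produces the uber-jar under target/scala-2.11/, and that jar is what you pass to spark-submit.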
Pozdrawiam,
Jacek Laskowski
https://medium.com/@jacekla
Thanks Jacek,
I thought there was some dependency issue. This did the trick
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.0" % "provided"
I
Go to spark-shell and do :imports. You'll see all the imports and you
could copy and paste them into your app (but there are not many,
honestly, and that won't help you much).
HiveContext lives in spark-hive. You don't need spark-sql and
spark-hive since the latter uses the former as a dependency (unl
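In build.sbt terms that means a single line is usually enough when you need Hive support, since spark-hive pulls in spark-sql (and spark-core) transitively; a sketch assuming Spark 2.0.0 as elsewhere in this thread:

libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.0" % "provided"  // "provided": supplied by the cluster at runtime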
The issue is that in the Spark shell this works OK:
Spark context Web UI available at http://50.140.197.217:5
Spark context available as 'sc' (master = local, app id =
local-1471191662017).
Spark session available as 'spark'.
Welcome to
      [Spark ASCII-art logo banner]
HiveContext is gone
SparkSession now combines the functionality of SQLContext and HiveContext (if
Hive support is available).
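A minimal sketch of the Spark 2 replacement for the old HiveContext code (the app name and table name are placeholders):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("MyApp")            // placeholder application name
  .enableHiveSupport()         // needs spark-hive on the classpath
  .getOrCreate()

import spark.implicits._       // the implicits formerly reached via sqlContext.implicits._

val df = spark.sql("SELECT * FROM some_hive_table")   // hypothetical table; this is what hiveContext.sql(...) used to do
df.show()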
On Sun, Aug 14, 2016 at 12:12 PM, Mich Talebzadeh wrote:
> Thanks Koert,
>
> I did that before as well. Anyway this is dependencies
>
> libraryDependencies += "org.apache.spa
Thanks Koert,
I did that before as well. Anyway, these are the dependencies:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.0"
and the error:
[info]
You cannot mix Spark 1 and Spark 2 jars. Change this:
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
to
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.0"
On Sun, Aug 14, 2016 at 11:58 AM, Mich Talebzadeh wrote:
> Hi,
>
> In Spark 2 I am using sbt or mvn to c
Hi,
In Spark 2 I am using sbt or mvn to compile my Scala program. This used to
compile and run perfectly with Spark 1.6.1, but now it is throwing an error.
I believe the problem is here. I have:
name := "scala"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" %% "sp