Hi, try this way:

val df = sales_demand.join(product_master, sales_demand("INVENTORY_ITEM_ID") ===
  product_master("INVENTORY_ITEM_ID"), "inner")
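One thing to watch with this join: because both DataFrames carry a column named INVENTORY_ITEM_ID, the joined result contains that column twice, and later references to it by name become ambiguous. A minimal sketch of one way to handle that (the table and column names are taken from the thread; the `drop` call is just one option, available since Spark 1.4.x):

```scala
// Assumes sales_demand and product_master are existing DataFrames
// that both contain an INVENTORY_ITEM_ID column.
val df = sales_demand.join(
  product_master,
  sales_demand("INVENTORY_ITEM_ID") === product_master("INVENTORY_ITEM_ID"),
  "inner"
)

// The joined result has two INVENTORY_ITEM_ID columns; dropping the one
// from product_master keeps later column references unambiguous.
val deduped = df.drop(product_master("INVENTORY_ITEM_ID"))
```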
> On 30 Aug 2016, at 17:52, Jacek Laskowski wrote:
>
> Hi Mich,
>
> This is the first time I've been told about $ for string interpolation (as
> the function n
1.6.1
> On 27 Apr 2016, at 18:28, Sachin Aggarwal wrote:
>
> What is your Spark version?
>
> On Wed, Apr 27, 2016 at 3:12 PM, shengshanzhang <shengshanzh...@icloud.com> wrote:
> Hi,
>
> On the Spark website, there is code as follows showing how to create
> Datasets.
Hi,

On the Spark website, there is code as follows showing how to create
Datasets. However, when I enter this line into spark-shell, an error
comes up. Can anyone tell me why, and how to fix it?

scala> val ds = Seq(1, 2, 3).toDS()
<console>:35: error: value toDS is not a member of Seq[Int]
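For context on this error: `toDS` is not a method on `Seq` itself; it is added by the SQL implicit conversions. A minimal sketch of what makes it resolve in Spark 1.6.x (assuming `sc` is an existing SparkContext, as in spark-shell; spark-shell normally pre-imports these implicits, so seeing this error there usually means the spark-sql classes are not on the classpath, which matches the build.sbt advice later in this thread):

```scala
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)
// Brings the toDS()/toDF() enrichments on Seq and RDD into scope.
import sqlContext.implicits._

val ds = Seq(1, 2, 3).toDS()
```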
" % "test"
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1" %
> "provided"
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.1"
> On 27 Apr 2016, at 16:58, ramesh reddy <ramesh_sre...@yahoo.co.in> wrote:
>
> The Spark SQL jar has to be added as a dependency in build.sbt.
>
>
> On Wednesday, 27 April 2016 1:57 PM, shengshanzhang
> wrote:
Hello,

My code is as follows:
---
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Record(key: Int, value: String)

object RDDRelation {
  def main(args: Array[String]) {
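The message is cut off here, but a standalone program like this typically continues along the lines sketched below for Spark 1.6.x. This is an illustrative completion, not the poster's actual code; the table name and sample data are invented, and it assumes the spark-sql dependency from the build.sbt above is on the classpath:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Record(key: Int, value: String)

object RDDRelation {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("RDDRelation")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    // Without this import, rdd.toDF() fails to compile, just as
    // Seq(...).toDS() failed in the earlier message.
    import sqlContext.implicits._

    val df = sc.parallelize(1 to 10).map(i => Record(i, s"val_$i")).toDF()
    df.registerTempTable("records")
    sqlContext.sql("SELECT key, value FROM records WHERE key < 5").show()

    sc.stop()
  }
}
```

The key point for the thread: both the `import sqlContext.implicits._` line and the spark-sql jar on the classpath are needed before `toDF`/`toDS` resolve.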