It works for me without making any other change. Try importing
import sqlContext.implicits._
Otherwise, verify whether you are able to run other functions or whether
there is some issue with your setup.
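
For reference, here is a minimal sketch of the full sequence in a 1.6.x
spark-shell (sqlContext is the SQLContext the shell creates for you):

    // sqlContext is provided automatically by spark-shell
    import sqlContext.implicits._      // brings toDS()/toDF() on local Seqs into scope

    val ds = Seq(1, 2, 3).toDS()       // Dataset[Int]
    ds.map(_ + 1).collect()            // Array(2, 3, 4)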

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java
1.8.0_91)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/04/27 16:07:23 WARN Connection: BoneCP specified but not present in
CLASSPATH (or one of dependencies)
16/04/27 16:07:23 WARN Connection: BoneCP specified but not present in
CLASSPATH (or one of dependencies)
16/04/27 16:07:29 WARN ObjectStore: Version information not found in
metastore. hive.metastore.schema.verification is not enabled so recording
the schema version 1.2.0
16/04/27 16:07:29 WARN ObjectStore: Failed to get database default,
returning NoSuchObjectException
16/04/27 16:07:29 WARN : Your hostname, sachins-MacBook-Pro-2.local
resolves to a loopback/non-reachable address:
fe80:0:0:0:288b:f5ff:fe80:367e%awdl0, but we couldn't find any external IP
address!
16/04/27 16:07:36 WARN Connection: BoneCP specified but not present in
CLASSPATH (or one of dependencies)
16/04/27 16:07:36 WARN Connection: BoneCP specified but not present in
CLASSPATH (or one of dependencies)
SQL context available as sqlContext.

scala>

scala> val ds = Seq(1, 2, 3).toDS()
ds: org.apache.spark.sql.Dataset[Int] = [value: int]

scala> ds.map(_ + 1).collect() // Returns: Array(2, 3, 4)
res0: Array[Int] = Array(2, 3, 4)
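
toDS() is not a method on Seq itself; it is added by an implicit conversion
from sqlContext.implicits, which is why the import matters when it is not
already in scope. The same implicits also cover Datasets of tuples; a small
sketch (not from the thread, just to illustrate):

    import sqlContext.implicits._

    // tuple encoders come from the same implicits
    val pairs = Seq(("a", 1), ("b", 2)).toDS()          // Dataset[(String, Int)]
    pairs.map { case (k, v) => (k, v + 1) }.collect()   // Array((a,2), (b,3))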

On Wed, Apr 27, 2016 at 4:01 PM, shengshanzhang <shengshanzh...@icloud.com>
wrote:

> 1.6.1
>
> On 27 Apr 2016, at 18:28, Sachin Aggarwal <different.sac...@gmail.com> wrote:
>
> What is your Spark version?
>
> On Wed, Apr 27, 2016 at 3:12 PM, shengshanzhang <shengshanzh...@icloud.com
> > wrote:
>
>> Hi,
>>
>> On the Spark website, there is code as follows showing how to create Datasets.
>> [screenshot of the Dataset example from the Spark documentation]
>>
>> However, when I enter this line into spark-shell, I get an error. Can
>> anyone tell me why, and how to fix it?
>>
>> scala> val ds = Seq(1, 2, 3).toDS()
>> <console>:35: error: value toDS is not a member of Seq[Int]
>>        val ds = Seq(1, 2, 3).toDS()
>>
>>
>>
>> Thanks a lot!
>>
>
>
>
> --
>
> Thanks & Regards
>
> Sachin Aggarwal
> 7760502772
>
>
>


-- 

Thanks & Regards

Sachin Aggarwal
7760502772
