shown here:
https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema

Mohammed
From: Krishna Sankar [mailto:ksanka...@gmail.com]
Sent: Wednesday, July 1, 2015 3:09 PM
To: Hafiz Mujadid
Cc: user@spark.apache.org
Subject: Re: making dataframe for different types using spark-csv
- use .cast("...").alias('...') after the DataFrame is read.
- sql.functions.udf for any domain-specific conversions.
Cheers
On Wed, Jul 1, 2015 at 11:03 AM, Hafiz Mujadid wrote:
> Hi experts!
>
>
> I am using spark-csv to load csv data into dataframe. By default it makes
> type of each