I hope this is not too late :).

It is possible.

Please check the csvRdd API here:
https://github.com/databricks/spark-csv/blob/master/src/main/scala/com/databricks/spark/csv/CsvParser.scala#L150
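
For what it's worth, here is a rough sketch of how csvRdd might be used. It assumes an existing SparkContext (sc), an SQLContext (sqlContext), and spark-csv on the classpath; the exact builder options can differ between spark-csv versions, so please check the linked source.

import com.databricks.spark.csv.CsvParser
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.DataFrame

// Each element of the RDD is one line of CSV text.
// If each element packs several lines into one string, split first,
// e.g. csvBlocks.flatMap(_.split("\n")).
val csvLines: RDD[String] = sc.parallelize(Seq(
  "id,name,score",
  "1,alice,0.9",
  "2,bob,0.7"
))

// Configure the parser and feed it the RDD directly via csvRdd.
val df: DataFrame = new CsvParser()
  .withUseHeader(true)     // treat the first line as the header row
  .withDelimiter(',')      // adjust if your data uses another delimiter
  .withInferSchema(true)   // optional: infer column types instead of all strings
  .csvRdd(sqlContext, csvLines)

df.show()

From the resulting DataFrame you can then write out to Hive/HBase however you like.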

Thanks!
On 2 Apr 2016 2:47 a.m., "Benjamin Kim" <bbuil...@gmail.com> wrote:

> Does anyone know if this is possible? I have an RDD loaded with rows of
> CSV data strings. Each string represents the header row and multiple rows
> of data along with delimiters. I would like to feed each through a CSV parser
> to convert the data into a DataFrame and, ultimately, UPSERT a Hive/HBase
> table with this data.
>
> Please let me know if you have any ideas.
>
> Thanks,
> Ben
