For your case, you can use a trick: split on the three-character sequence "," (quote, comma, quote) instead of a bare comma.
You can refer to the following code snippet:

val people = sc.textFile("examples/src/main/resources/data.csv")
  .map(x => x.substring(1, x.length - 1).split("\",\""))  // drop the outer quotes, split on ","
  .map(p => List(p(0), p(1), p(2)))
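To see why this works, here is a minimal, Spark-free sketch of the same split (the sample row and the object/method names are hypothetical, just for illustration): the outer quotes are stripped and the line is split on the "," sequence, so a comma inside a quoted field survives intact.

```scala
object QuoteSplitDemo {
  // Strip the leading and trailing quote, then split on the "," sequence
  // that separates quoted fields; commas *inside* a field are untouched.
  def parseLine(line: String): List[String] =
    line.substring(1, line.length - 1).split("\",\"").toList

  def main(args: Array[String]): Unit = {
    // Hypothetical row: the third field itself contains a comma
    val row = "\"John\",\"35\",\"12 Main St, Springfield\""
    println(parseLine(row))  // List(John, 35, 12 Main St, Springfield)
  }
}
```

Note that a naive row.split(",") on the same line would produce four pieces, which is exactly the symptom described below.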


On Feb 19, 2015, at 10:02 PM, Francesco Bonamente 
<francescoboname...@gmail.com> wrote:

> Hi Yanbo,
> unfortunately all the CSV files contain commas inside some columns, and I 
> can't change the structure.
> 
> How can I work with this kind of text file and Spark SQL?
>  
> Thank you again
> 
> 
> 2015-02-19 14:38 GMT+01:00 Yanbo Liang <hackingda...@gmail.com>:
> This is because each line is being split into 4 columns instead of 3.
> If you use a comma to separate columns, no column value is allowed to 
> contain a comma itself.
> 
> 
> 2015-02-19 18:12 GMT+08:00 sparkino <francescoboname...@gmail.com>:
> Hello everybody,
> I'm quite new to Spark and Scala, and I was trying to analyze some
> CSV data via Spark SQL.
> 
> My csv file contains data like this
> 
> 
> 
> Following the example at this link below
> https://spark.apache.org/docs/latest/sql-programming-guide.html#inferring-the-schema-using-reflection
> 
> I have this code:
> 
> 
> When I try to fetch "val name_address", the second field (address) does not
> contain all the text. The issue is probably the comma: it is both the split
> character and part of the string value of the third column (address) in the
> CSV file.
> 
> How can I handle this?
> 
> Thank you in advance
> Sparkino
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/spark-sql-problem-with-textfile-separator-tp21718.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
> 
> 
