Hi,

Thanks all, it is working fine now; the earlier issue was some extra whitespace in the dept id.

I have one more doubt: for the non-matching records the join result shows the
word "null". Even when I write the result into HDFS it shows "null". How can
we avoid writing "null" for the non-matching columns? I just want an empty
value ("").

I used the same input for the dept table but removed the last row, and I used
the code below to write the join result into HDFS.

DataFrame joinResult = sqlContext.sql(
        "SELECT * FROM EMP e LEFT OUTER JOIN DEPT d ON e.deptid = d.deptid");

joinResult.javaRDD().repartition(1).map(new Function<Row, String>() {
        private static final long serialVersionUID = 9185646063977504742L;

        @Override
        public String call(Row arg0) throws Exception {
                String s = arg0.getString(0) + "\u001c" + arg0.getString(1)
                        + "\u001c" + arg0.getString(2) + "\u001c"
                        + arg0.getString(3) + "\u001c" + arg0.getString(4)
                        + "\u001e";
                return s;
        }
}).saveAsTextFile(args[2]);


Output in the HDFS file:

10 1001 aba 10 dev
10 1003 abd 10 dev
10 1005 abg 10 dev
10 1007 abj 10 dev
10 1010 abq 10 dev
20 1002 abs 20 Test
20 1006 abh 20 Test
20 1009 abl 20 Test
30 1004 abf null null
30 1008 abk null null
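
For the moment, one workaround I am considering is to null-check each column
inside call(), so that a missing field becomes "" instead of the literal
"null". This is only a rough sketch of what I mean (I am assuming the same
five-column row layout as above, and that Row.isNullAt() is the right way to
test each field):

    joinResult.javaRDD().repartition(1).map(new Function<Row, String>() {
            private static final long serialVersionUID = 1L;

            @Override
            public String call(Row row) throws Exception {
                    StringBuilder sb = new StringBuilder();
                    for (int i = 0; i < 5; i++) {
                            // append "" instead of the word "null" for
                            // non-matching (null) columns
                            sb.append(row.isNullAt(i) ? "" : row.getString(i));
                            // \u001c between fields, \u001e after the record
                            sb.append(i < 4 ? "\u001c" : "\u001e");
                    }
                    return sb.toString();
            }
    }).saveAsTextFile(args[2]);

Is that the right way to do it, or is there something built in for this?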

In my case I want to store the join result back into a database table, and it
is storing the word "null" for the non-matching records; I want to store ""
(an empty value) for those non-matching rows.

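Would something like DataFrameNaFunctions be the right approach here, i.e.
replacing the nulls before writing out? A rough sketch of what I mean (the
JDBC URL, table name and credentials below are only placeholders; I am
assuming na().fill("") replaces nulls in the string columns and that the
write().jdbc() API from Spark 1.4+ is available):

    // replace nulls in string columns with "" before saving
    DataFrame cleaned = joinResult.na().fill("");

    // placeholder connection details, not my real settings
    java.util.Properties props = new java.util.Properties();
    props.setProperty("user", "dbuser");
    props.setProperty("password", "dbpass");

    // write the cleaned join result into a database table
    cleaned.write().jdbc("jdbc:mysql://host:3306/mydb", "EMP_DEPT_JOIN", props);

Or is there a better way to handle the non-matching rows?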

