You can use the Spark DataFrame `when`/`otherwise` clause to replace a SQL CASE
statement.
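As a minimal sketch of that idea (the column names and the `approved`/`pending` labels here are made up for illustration, not taken from the original query):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{when, col, lit}

object WhenOtherwiseExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("when-otherwise-example")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data.
    val students = Seq((1, 2), (2, 3)).toDF("student_id", "approval_id")

    // Equivalent of:
    //   CASE WHEN approval_id = 2 THEN 'approved' ELSE 'pending' END AS status
    val withStatus = students.withColumn(
      "status",
      when(col("approval_id") === 2, lit("approved"))
        .otherwise(lit("pending"))
    )

    withStatus.show()
    spark.stop()
  }
}
```

Chained `when` calls before the final `otherwise` cover multi-branch CASE expressions.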
This piece will need to be calculated first:
select student_id from tbl_student
where candidate_id = c.candidate_id
  and approval_id = 2
  and academic_start_date is null
Take the count of the above DataFrame after the join.
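One way to sketch that correlated subquery as a DataFrame join plus count — assuming a `candidates` DataFrame stands in for the outer table aliased `c` in the original query (that DataFrame and its contents are assumptions; only the column names come from the SQL above):

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Correlated subquery rewritten as an inner join:
//   select student_id from tbl_student
//   where candidate_id = c.candidate_id
//     and approval_id = 2
//     and academic_start_date is null
def matchingStudentCount(candidates: DataFrame, students: DataFrame): Long = {
  students.as("s")
    // The correlation "candidate_id = c.candidate_id" becomes a join condition.
    .join(candidates.as("c"),
      col("s.candidate_id") === col("c.candidate_id"))
    // The remaining predicates become plain filters.
    .filter(col("s.approval_id") === 2 &&
            col("s.academic_start_date").isNull)
    .select(col("s.student_id"))
    .count()
}
```

Filtering `students` before the join lets Spark push the cheap predicates down and shrink the shuffle, which is usually the efficient order here.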
Dear Spark Users,
I came across a slightly weird MSSQL query to replace with Spark, and I have
no clue how to do it efficiently with Scala + Spark SQL. Can someone
please shed some light? I can create a view of the DataFrame and run it as
`spark.sql(query)`,
but I would like to do it with the Scala + Spark DataFrame API.