Hi,
I have created DataFrames on three Oracle tables. The join in Hive and Oracle is pretty simple:

SELECT t.calendar_month_desc, c.channel_desc, SUM(s.amount_sold) AS TotalSales
FROM sales s, times t, channels c
WHERE s.time_id = t.time_id
AND s.channel_id = c.channel_id
GROUP BY t.calendar_month_desc, c.channel_desc;

I try to do this using DataFrames:

import org.apache.spark.sql.functions._

val sqlContext = new org.apache.spark.sql.SQLContext(sc)

val s = sqlContext.load("jdbc", Map("url" -> "jdbc:oracle:thin:@rhes564:1521:mydb",
  "dbtable" -> "(sh.sales)", "user" -> "sh", "password" -> "xxxxx"))
val c = sqlContext.load("jdbc", Map("url" -> "jdbc:oracle:thin:@rhes564:1521:mydb",
  "dbtable" -> "(sh.channels)", "user" -> "sh", "password" -> "xxxxx"))
val t = sqlContext.load("jdbc", Map("url" -> "jdbc:oracle:thin:@rhes564:1521:mydb",
  "dbtable" -> "(sh.times)", "user" -> "sh", "password" -> "xxxxx"))

val sc = s.join(c, s.col("CHANNEL_ID") === c.col("CHANNEL_ID"))
val st = s.join(t, s.col("TIME_ID") === t.col("TIME_ID"))
val rs = sc.join(st)
rs.groupBy($"calendar_month_desc", $"channel_desc").agg(sum($"amount_sold"))

The last result set (rs) does not work. Since the data is imported, I assume that the join columns need to be defined in the DataFrame for each table, rather than importing all of the columns.

Thanks,

Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
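For comparison, the three-way join could be chained as in the sketch below. It is untested and makes two assumptions beyond the code above: Oracle folds unquoted identifiers to upper case, so the grouped columns are referenced in upper case; and the intermediate result is renamed (here hypothetically to `s_c`) so it no longer shadows the SparkContext `sc` that `sqlContext` was built from.

```scala
// Hypothetical sketch, not verified against the sh schema.
// Assumes the JDBC DataFrames s, c and t loaded above, plus
// import sqlContext.implicits._ for the $"..." column syntax.

// Rename the sales-channels join so it does not shadow the SparkContext `sc`.
val s_c = s.join(c, s.col("CHANNEL_ID") === c.col("CHANNEL_ID"))

// Join the result to times on the shared key instead of an unconditioned join.
val rs = s_c.join(t, s_c.col("TIME_ID") === t.col("TIME_ID"))

// Oracle stores unquoted column names in upper case.
rs.groupBy($"CALENDAR_MONTH_DESC", $"CHANNEL_DESC")
  .agg(sum($"AMOUNT_SOLD").as("TotalSales"))
  .show()
```

The original `val rs = sc.join(st)` has no join condition, so it produces a Cartesian product of the two intermediate results; joining each table on its key, as above, avoids that.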