I thought we could use the sqlContext.sql("some join query") API with JDBC,
which is why I asked the question above.

But since we can only use
sqlContext.read().format("jdbc").options(options).load(), we can supply the
actual join query to the Oracle source there instead.
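
For the archives, here is a minimal sketch of what I mean, assuming Spark 1.6
with Scala; the connection URL, credentials, and the orders/customers tables
are hypothetical placeholders. Passing the join as a parenthesized subquery in
the dbtable option makes Oracle execute the join, so Spark only reads the
joined result:

    import org.apache.spark.sql.SQLContext

    // Assumes an existing SparkContext named sc; all connection details
    // and table/column names below are placeholders.
    val sqlContext = new SQLContext(sc)

    // Oracle runs the join inside the parenthesized subquery;
    // Spark only sees the aliased result set.
    val joinQuery =
      """(SELECT o.order_id, o.amount, c.customer_name
        |   FROM orders o
        |   JOIN customers c ON o.customer_id = c.customer_id) pushed_join""".stripMargin

    val df = sqlContext.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
      .option("driver", "oracle.jdbc.OracleDriver")
      .option("dbtable", joinQuery)
      .option("user", "scott")
      .option("password", "tiger")
      .load()

    df.show()

With this approach the join itself runs in Oracle, and the usual JDBC read
options (partitionColumn, lowerBound, upperBound, numPartitions) can still be
added to parallelize the fetch if needed.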

So this question is not valid. Please ignore it.

Thanks & Regards,
B Anil Kumar.

On Tue, Oct 25, 2016 at 2:35 PM, AnilKumar B <akumarb2...@gmail.com> wrote:

> Hi,
>
> I am using Spark SQL to transform data. My source is Oracle. In general,
> I extract multiple tables, join them, and then apply some other
> transformations in Spark.
>
> Is there any possibility of pushing the join operator down to Oracle
> using Spark SQL, instead of fetching and joining in Spark? I am unable to
> find any options for these optimizations at
> https://spark.apache.org/docs/1.6.0/sql-programming-guide.html#jdbc-to-other-databases.
>
> I am currently using Spark version 1.6.
>
> Thanks & Regards,
> B Anil Kumar.
>
