Just use the .format('jdbc') data source? This is built in, for all
languages. You can get an RDD out if you must.
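For example, here is a minimal PySpark sketch of reading MySQL through the
built-in JDBC source and then taking an RDD from the resulting DataFrame.
The host, database, table name, and credentials below are placeholders, and
it assumes the MySQL Connector/J jar is on the classpath (e.g. via --jars):

    # Minimal sketch: read MySQL via the built-in JDBC source in PySpark.
    # URL, table, user, and password are placeholders for illustration only.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("jdbc-example").getOrCreate()

    df = (spark.read
          .format("jdbc")
          .option("url", "jdbc:mysql://db-host:3306/testdb")
          .option("dbtable", "users")
          .option("user", "spark")
          .option("password", "secret")
          .option("driver", "com.mysql.cj.jdbc.Driver")
          .load())

    # If you really need an RDD, convert the DataFrame; each element is a Row.
    rdd = df.rdd
    print(rdd.take(5))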

On Mon, Sep 19, 2022, 5:28 AM javaca...@163.com <javaca...@163.com> wrote:

> Thank you for your answer, Alton.
>
> But I see that it uses Scala to implement it.
> I know Java/Scala can get data from MySQL using JdbcRDD fairly well.
> But I want to do the same thing in PySpark.
>
> Could you give me more advice? Thank you very much.
>
>
> ------------------------------
> javaca...@163.com
>
>
> *From:* Xiao, Alton <alton.x...@sap.com.INVALID>
> *Sent:* 2022-09-19 18:04
> *To:* javaca...@163.com; user@spark.apache.org
> *Subject:* RE: [how to] RDD using JDBC data source in PySpark
>
> Hi javacaoyu:
>
> https://hevodata.com/learn/spark-mysql/#Spark-MySQL-Integration
>
> I think Spark has already integrated MySQL.
>
>
>
> *From:* javaca...@163.com <javaca...@163.com>
> *Date:* Monday, September 19, 2022 17:53
> *To:* user@spark.apache.org <user@spark.apache.org>
> *Subject:* [how to] RDD using JDBC data source in PySpark
>
>
> Hi guys:
>
>     Is there some way for an RDD to use a JDBC data source in PySpark?
>
>     I want to get data from MySQL, but PySpark does not provide a
> JdbcRDD like Java/Scala do.
>
>     And I searched the docs on the web site but found no answer.
>
>     So I need your help. Thank you very much.
>
>
> ------------------------------
>
> javaca...@163.com
>
>
