Just use the .format('jdbc') data source? This is built in, for all
languages. You can get an RDD out if you must.

On Mon, Sep 19, 2022, 5:28 AM [email protected] <[email protected]> wrote:

> Thank you for your answer, Alton.
>
> But I see that it uses Scala to implement it.
> I know Java/Scala can get data from MySQL using JdbcRDD fairly well,
> but I want to do the same in PySpark.
>
> Could you give me more advice? Many thanks.
>
> ------------------------------
> [email protected]
>
> *From:* Xiao, Alton <[email protected]>
> *Sent:* 2022-09-19 18:04
> *To:* [email protected]; [email protected]
> *Subject:* Re: [how to] RDD using JDBC data source in PySpark
>
> Hi javacaoyu:
>
> https://hevodata.com/learn/spark-mysql/#Spark-MySQL-Integration
>
> I think Spark has already integrated MySQL.
>
> *From:* [email protected] <[email protected]>
> *Date:* Monday, September 19, 2022, 17:53
> *To:* [email protected] <[email protected]>
> *Subject:* [how to] RDD using JDBC data source in PySpark
>
> Hi guys:
>
> Is there some way for an RDD to use a JDBC data source in PySpark?
>
> I want to get data from MySQL, but PySpark does not support a JdbcRDD
> like Java/Scala, and I searched the docs on the web site with no answer.
>
> So I need your help. Thank you very much.
>
> ------------------------------
> [email protected]
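To make the suggestion concrete, here is a minimal PySpark sketch of the built-in JDBC source followed by a drop to an RDD. The host, database, table, and credentials are placeholders, and it assumes the MySQL Connector/J jar is available on the Spark classpath:

```python
# Placeholder connection settings -- replace with your own server,
# database, table, and credentials.
jdbc_options = {
    "url": "jdbc:mysql://dbhost:3306/mydb",
    "dbtable": "mytable",
    "user": "myuser",
    "password": "mypassword",
    "driver": "com.mysql.cj.jdbc.Driver",  # Connector/J must be on the classpath
}

def load_mysql_rdd(spark):
    """Read the table through the built-in JDBC source and return an RDD of Rows."""
    df = spark.read.format("jdbc").options(**jdbc_options).load()
    # The DataFrame is usually all you need; .rdd drops down to an RDD
    # of pyspark.sql.Row objects if you really want RDD semantics.
    return df.rdd
```

You would call this with a session from `SparkSession.builder.getOrCreate()`, launching via something like `spark-submit --jars mysql-connector-j-<version>.jar script.py` so the driver class can be found.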
