Hi, 
Recently I have been planning to use Spark SQL to run some tests over a large 
MySQL table, to compare performance between Spark and MyCat. However, the load 
is extremely slow, and I hope someone can help me tune it. 
Environment: Spark 1.4.1  
Code snippet: 
    val prop = new java.util.Properties
    prop.setProperty("user", "root")
    prop.setProperty("password", "123456")

    val url1 = "jdbc:mysql://localhost:3306/db1"
    val jdbcDF = sqlContext.read.jdbc(url1, "video", prop)
    jdbcDF.registerTempTable("video_test")
    sqlContext.sql("select count(1) from video_test").show()

Overall, the load process gets stuck and eventually hits a connection timeout. 
The MySQL table holds about 100 million records. 
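My suspicion is that the plain three-argument jdbc() call above pulls the whole 
table through a single connection in one partition. A partitioned variant is 
sketched below; this is untested on my side and assumes the `video` table has a 
numeric primary key `id` spanning roughly 0 to 100,000,000, which I have not 
verified against my schema:

    import java.util.Properties

    val prop = new Properties
    prop.setProperty("user", "root")
    prop.setProperty("password", "123456")

    val url1 = "jdbc:mysql://localhost:3306/db1"

    // The seven-argument overload splits the scan into numPartitions range
    // queries on the partition column, so the read runs in parallel instead
    // of as one giant single-threaded fetch.
    val jdbcDF = sqlContext.read.jdbc(
      url1,
      "video",
      "id",          // partitionColumn: must be a numeric column (assumed)
      0L,            // lowerBound (assumed minimum id)
      100000000L,    // upperBound (assumed maximum id)
      100,           // numPartitions
      prop)

    jdbcDF.registerTempTable("video_test")
    sqlContext.sql("select count(1) from video_test").show()

If that overload is the right fix, I would still welcome advice on choosing 
the bounds and partition count for a table this size.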
I would be happy to provide more information. 

Best,
Sun.
   


