Hi Guys, 

I’m having an issue loading data with the JDBC connector.
My line of code is:

val df = sqlContext.read.format("jdbc")
  .option("url", "jdbc:mysql://localhost:3306").option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", "sometable").option("user", "superuser").option("password", "supersafe")
  .option("partitionColumn", "id").option("lowerBound", "1325376000")
  .option("upperBound", "1483228800").option("numPartitions", "20")
  .load()

when I do:
df.filter("last_value IS NOT NULL").filter("utc_timestamp <= 1347369839").count
or
df.filter("last_value IS NOT NULL").filter("`utc_timestamp` <= 1347369839").count

in the MySQL logs I see:
SELECT `last_value`,`utc_timestamp` FROM sometable WHERE utc_timestamp <= 1347369839 AND id >= 1451658240 AND id < 1459550880

The problem is that UTC_TIMESTAMP is a function in MySQL, so the unquoted column name gets evaluated as a function call instead of a column comparison.
How can I force Spark not to strip the backticks from the WHERE clause it pushes down?
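
The only workaround I can think of so far (untested sketch; the alias name utc_ts is just something I made up) is to push the quoting into a dbtable subquery, since that SQL is sent to MySQL verbatim, and then filter on the alias so the pushed-down predicate never mentions the function name:

val query = "(SELECT `id`, `last_value`, `utc_timestamp` AS utc_ts FROM sometable) AS t"
val df2 = sqlContext.read.format("jdbc")
  .option("url", "jdbc:mysql://localhost:3306")
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", query) // backticks survive because Spark embeds this string as-is
  .option("user", "superuser").option("password", "supersafe")
  .option("partitionColumn", "id").option("lowerBound", "1325376000")
  .option("upperBound", "1483228800").option("numPartitions", "20")
  .load()

// the pushed-down filter now references utc_ts, which MySQL cannot mistake for a function
df2.filter("last_value IS NOT NULL").filter("utc_ts <= 1347369839").count

But I would rather keep the real column name if Spark can be told to quote identifiers in the generated WHERE clause.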




Jorge Machado
www.jmachado.me
