I fixed it by:

val jdbcDF = spark.read
  .format("org.apache.spark.sql.execution.datasources.jdbc.DefaultSource")
  .options(Map(
    "url"      -> s"jdbc:mysql://${mysqlhost}:3306/test",
    "driver"   -> "com.mysql.jdbc.Driver",
    "dbtable"  -> "i_user",
    "user"     -> "root",
    "password" -> "123456"))
  .load()

where org.apache.spark.sql.execution.datasources.jdbc.DefaultSource and
org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider both
register the same short name: jdbc
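A minimal sketch of an alternative, assuming a live SparkSession named `spark` and the same placeholder host and credentials as above: the DataFrameReader.jdbc helper takes the URL and table directly and may sidestep the short-name lookup that caused the ambiguity.

```scala
import java.util.Properties

// Connection details mirror the snippet above; the host and credentials
// are placeholders, not real values.
val props = new Properties()
props.setProperty("driver", "com.mysql.jdbc.Driver")
props.setProperty("user", "root")
props.setProperty("password", "123456")

// Requires a live SparkSession (`spark`) and a reachable MySQL instance:
// val jdbcDF = spark.read.jdbc("jdbc:mysql://localhost:3306/test", "i_user", props)
```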

2016-08-01 15:24 GMT+08:00 Nikolay Zhebet <phpap...@gmail.com>:

> You should specify the classpath for your JDBC connection.
> As an example, if you want to connect to Impala, you can try this snippet:
>
>
>
> import java.util.Properties
> import org.apache.spark._
> import org.apache.spark.sql.SQLContext
> import java.sql.Connection
> import java.sql.DriverManager
>
> Class.forName("com.cloudera.impala.jdbc41.Driver")
>
> var conn: java.sql.Connection = null
> conn = DriverManager.getConnection(
>   "jdbc:impala://127.0.0.1:21050/default;auth=noSasl", "", "")
> val statement = conn.createStatement()
>
> val result = statement.executeQuery("SELECT * FROM users LIMIT 10")
> result.next()
> result.getString("user_id")
>
> val sql_insert = "INSERT INTO users VALUES('user_id','email','gender')"
> statement.executeUpdate(sql_insert)
>
>
> Also, you should specify the path to your JDBC jar file via the
> --driver-class-path option when you run spark-submit (or spark-shell):
>
> spark-shell --master "local[2]" \
>   --driver-class-path /opt/cloudera/parcels/CDH/jars/ImpalaJDBC41.jar
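If the query also runs on executors (e.g. on a real cluster rather than in local mode), the driver jar may additionally need to be shipped with --jars; a sketch using the same jar path:

```shell
# Sketch: --jars distributes the driver jar to the executors as well;
# --driver-class-path alone only affects the driver JVM's classpath.
spark-shell --master "local[2]" \
  --driver-class-path /opt/cloudera/parcels/CDH/jars/ImpalaJDBC41.jar \
  --jars /opt/cloudera/parcels/CDH/jars/ImpalaJDBC41.jar
```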
>
>
> 2016-08-01 9:37 GMT+03:00 kevin <kiss.kevin...@gmail.com>:
>
>> Maybe there is another version of Spark on the classpath?
>>
>> 2016-08-01 14:30 GMT+08:00 kevin <kiss.kevin...@gmail.com>:
>>
>>> hi, all:
>>>    I tried to load data from a JDBC datasource, but I got this error:
>>> java.lang.RuntimeException: Multiple sources found for jdbc
>>> (org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider,
>>> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource), please
>>> specify the fully qualified class name.
>>>
>>> The Spark version is 2.0.
>>>
>>>
>>
>
