Can you write a script to download and install the JDBC driver to the local
Maven repository if it's not already present? If we had that, we could just
invoke it as part of dev/run-tests.
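Something along these lines might work. This is only a minimal sketch: the coordinates, download URL, and function names are placeholders, not the real driver's location, and it's written in Python on the assumption that it would be called from Spark's Python-based dev/run-tests tooling.

```python
#!/usr/bin/env python
# Minimal sketch: make sure a JDBC driver jar is installed in the local
# Maven repository before running the Docker JDBC integration tests.
# The coordinates and download URL below are placeholders.
import os
import subprocess

try:  # Python 3
    from urllib.request import urlretrieve
except ImportError:  # Python 2
    from urllib import urlretrieve

GROUP_ID = "com.example.jdbc"          # placeholder groupId
ARTIFACT_ID = "example-jdbc-driver"    # placeholder artifactId
VERSION = "1.0.0"                      # placeholder version
DRIVER_URL = "https://example.com/drivers/example-jdbc-driver-1.0.0.jar"


def installed_jar_path():
    """Path where Maven would keep this artifact in ~/.m2/repository."""
    return os.path.join(
        os.path.expanduser("~"), ".m2", "repository",
        GROUP_ID.replace(".", os.sep), ARTIFACT_ID, VERSION,
        "%s-%s.jar" % (ARTIFACT_ID, VERSION))


def ensure_jdbc_driver():
    if os.path.isfile(installed_jar_path()):
        return  # already installed; nothing to do
    local_jar = os.path.join("/tmp", "%s-%s.jar" % (ARTIFACT_ID, VERSION))
    urlretrieve(DRIVER_URL, local_jar)  # download the driver jar
    # install:install-file puts the jar into the local repository, where
    # both Maven and the POM-based SBT build can resolve it.
    subprocess.check_call([
        "mvn", "install:install-file",
        "-Dfile=%s" % local_jar,
        "-DgroupId=%s" % GROUP_ID,
        "-DartifactId=%s" % ARTIFACT_ID,
        "-Dversion=%s" % VERSION,
        "-Dpackaging=jar",
    ])


if __name__ == "__main__":
    ensure_jdbc_driver()
```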
On Thu, Dec 3, 2015 at 5:55 PM Luciano Resende wrote:
>
> On Mon, Nov 30, 2015 at 1:53 PM, Josh Rosen wrote:
> The JDBC drivers are currently being pulled in as test-scope dependencies
> of the `sql/core` module:
> https://github.com/apache/spark/blob/f2fbfa444f6e8d27953ec2d1c0b3abd603c963f9/sql/core/pom.xml#L91
>
> In SBT, these wind up on the Docker JDBC tests' classpath as a transitive
> dependency of the `spark-sql` test JAR. However, what we should be doing is
> adding them as explicit test dependencies of the `docker-integration-tests`
> subproject,
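For illustration, the explicit declarations described above might look roughly like this in the `docker-integration-tests` POM. The versions here are placeholders, not necessarily what Spark actually pins:

```xml
<!-- Illustrative only: declare the JDBC drivers as explicit test-scope
     dependencies of docker-integration-tests, instead of relying on
     sql/core's transitive test classpath. Versions are placeholders. -->
<dependency>
  <groupId>mysql</groupId>
  <artifactId>mysql-connector-java</artifactId>
  <version>5.1.38</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.postgresql</groupId>
  <artifactId>postgresql</artifactId>
  <version>9.4.1207</version>
  <scope>test</scope>
</dependency>
```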
Hey Josh,
Thanks for helping bring this up. I have just pushed a WIP PR to get the
DB2 tests running on Docker, and I have a question about how the JDBC
drivers are actually being set up for the other data sources (MySQL and
PostgreSQL): are these set up directly on the Jenkins slaves?
Hey Luciano,
This sounds like a reasonable plan to me. One of my colleagues has written
some Dockerized MySQL testing utilities, so I'll take a peek at those to
see if there are any specifics of their solution that we should adapt for
Spark.
On Wed, Oct 21, 2015 at 1:16 PM, Luciano Resende wrote: