I figured it out, in case anyone else has this problem in the future:
spark-submit --driver-class-path lib/postgresql-9.4-1201.jdbc4.jar \
  --packages com.databricks:spark-csv_2.10:1.0.3 \
  path/to/my/script.py

What I found is that you MUST put the path to your script at the end of the spark-submit command. Also, wildcards in --driver-class-path work when using pyspark but don't work when using spark-submit.
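In case it helps to see the other half, here is a minimal sketch of the kind of script.py those two jars imply: reading a CSV with spark-csv and writing it out over JDBC. The input path, connection URL, table name, and credentials are made-up placeholders, not from my actual job, and it assumes the Spark 1.4 DataFrame reader/writer API.

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="csv-to-postgres")
sqlContext = SQLContext(sc)

# Read a CSV using the com.databricks.spark.csv data source,
# which is pulled in by the --packages coordinate above.
df = sqlContext.read.format("com.databricks.spark.csv") \
    .option("header", "true") \
    .load("path/to/input.csv")  # placeholder path

# Write out over JDBC; the Postgres driver jar is the one supplied
# via --driver-class-path on the submit command. URL, table, and
# credentials here are placeholders.
df.write.jdbc(url="jdbc:postgresql://localhost:5432/mydb",
              table="my_table",
              mode="append",
              properties={"user": "myuser", "password": "mypassword"})

sc.stop()

Launch it with the exact spark-submit line above, with the script path last.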