Re: No suitable drivers found for postgresql

2015-11-13 Thread Krishna Sangeeth KS
Hi, I have been trying to do this today at work with Impala as the data source, and I have been getting the same error as well. I am using the PySpark API with Spark 1.3, and I was wondering if there is any workaround for PySpark. I don't think we can use the --jars option in PySpark. Cheers
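For what it's worth, a workaround commonly suggested for Spark 1.3 is to put the JDBC driver jar on the driver's classpath explicitly, since --jars alone does not always make the driver class visible to the JDBC DriverManager on the driver side. A rough sketch (the jar path and script name below are placeholders, not from the original thread):

```shell
# Hypothetical paths -- the Impala/Hive JDBC driver jar name varies by distribution.
# --driver-class-path puts the jar on the driver's classpath so DriverManager
# can find it; --jars ships it to the executors as well.
spark-submit \
  --driver-class-path /path/to/impala-jdbc-driver.jar \
  --jars /path/to/impala-jdbc-driver.jar \
  my_script.py
```

On some 1.x setups, exporting SPARK_CLASSPATH to the same jar before launching PySpark has also been reported to work, though that variable was later deprecated.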

Re: Spark Effects of Driver Memory, Executor Memory, Driver Memory Overhead and Executor Memory Overhead on success of job runs

2015-09-01 Thread Krishna Sangeeth KS
Hi Timothy, I think the driver memory in all your examples is more than necessary for typical cases, while the executor memory is quite low. I found this devops talk [1] from Spark Summit super useful for understanding some of these configuration details. [1] https://www.youtube.com/watch?v=l4Z
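To make the suggestion concrete, a configuration along these lines reflects the usual guidance: a modest driver (it mostly coordinates) and larger executors (they do the actual work), with memory overhead set explicitly if YARN kills containers for exceeding their limits. The numbers below are illustrative assumptions, not values from the thread:

```shell
# Illustrative sizing sketch for a YARN deployment (Spark 1.x property names;
# overhead values are in MB). Tune to your cluster's container limits.
spark-submit \
  --driver-memory 2g \
  --executor-memory 8g \
  --conf spark.yarn.driver.memoryOverhead=512 \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my_job.py
```

Note that if the overhead is not set, Spark reserves a default of roughly 10% of the executor memory (with a small floor), which is often too little for jobs with heavy off-heap usage.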