Hi Steve,
I don't think I fully understand your answer. Please pardon my naivety
on the subject. From what I understand, the actual read will happen
in the executor, so the executor needs access to the data lake. In that sense, how
do I make sure that I can programmatically pass Azure credentials
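For context, a minimal sketch of the kind of thing I am trying to do, assuming ADLS Gen1 (adl://) and the OAuth2 client-credential keys from the Hadoop ADLS connector; the names and values below are placeholders, not our real setup:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("adls-auth-sketch").getOrCreate()

// Credentials set on the driver's Hadoop configuration; this configuration is
// shipped to the executors with the tasks, so the actual read on the executors
// uses the same values.
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("dfs.adls.oauth2.access.token.provider.type", "ClientCredential")
hadoopConf.set("dfs.adls.oauth2.client.id", "<service-principal-client-id>")
hadoopConf.set("dfs.adls.oauth2.credential", "<service-principal-secret>")
hadoopConf.set("dfs.adls.oauth2.refresh.url",
  "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

val df = spark.read.parquet("adl://myaccount.azuredatalakestore.net/path/to/dataset")
df.show(5)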
Hi,
I have written a Spark SQL job on Spark 2.0 using Scala. It just pulls
the data from a Hive table, adds extra columns, removes duplicates, and then
writes it back to Hive again.
In the Spark UI, it is taking almost 40 minutes to write 400 GB of data. Is there
anything that I need to improve?
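In case it helps, this is roughly the shape of the job (the table and column names below are made up; the real job has more columns):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder()
  .appName("hive-dedup-job")
  .enableHiveSupport()
  .getOrCreate()

// Pull from Hive, add a couple of derived columns, drop duplicates, write back.
val source = spark.table("source_db.events")

val enriched = source
  .withColumn("load_date", current_date())
  .withColumn("source_system", lit("upstream"))
  .dropDuplicates("event_id")   // full shuffle of the data happens here

enriched.write
  .mode("overwrite")
  .saveAsTable("target_db.events_clean")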
unsubscribe
Charles Bajomo
Operations Director
www.cloudxtiny.co.uk | Precision Technology Consulting Ltd
Registered England & Wales : 07397178
VAT No. : 124 4354 38 GB
Try this link to see how you may connect
https://docs.databricks.com/spark/latest/data-sources/sql-databases.html
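Roughly what that page walks through, as a sketch (the PostgreSQL URL, table, and credentials below are placeholders; the matching JDBC driver jar has to be on the classpath):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("jdbc-read-sketch").getOrCreate()

// Read a table from an external SQL database over JDBC.
val accounts = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://dbhost:5432/mydb")
  .option("dbtable", "public.accounts")
  .option("user", "spark_reader")
  .option("password", "<password>")  // better injected from a secret store than hard-coded
  .load()

accounts.show(5)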
Cheers
Jules
Sent from my iPhone
Pardon the dumb thumb typos :)
> On Aug 19, 2017, at 5:27 PM, kant kodali wrote:
>
> Hi Russell,
>
> I went through this
> https://jaceklaskows
Hi Russell,
I went through this
https://jaceklaskowski.gitbooks.io/mastering-apache-spark/spark-sql-thrift-server.html
and I am still a bit confused about what Hive is doing in here. Is there any
example I can look at on how to talk to Spark using the Spark SQL JDBC driver
alone, and not Hive?
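For reference, the closest thing I have pieced together so far is a plain JDBC client against the Thrift server, where Hive only seems to show up as the wire protocol and driver; the host, port, and table below are guesses on my side:

import java.sql.DriverManager

// Assumes the Spark Thrift Server is running (sbin/start-thriftserver.sh, default port 10000)
// and the Hive JDBC driver (org.apache.hive.jdbc.HiveDriver) is on the classpath.
Class.forName("org.apache.hive.jdbc.HiveDriver")

val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "user", "")
try {
  val rs = conn.createStatement().executeQuery("SELECT count(*) FROM some_table")
  while (rs.next()) println(rs.getLong(1))
} finally {
  conn.close()
}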
Thanks,
This might help; I've built a REST API with Livy Server:
https://livy.incubator.apache.org/
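A rough sketch of driving it over REST, assuming a Livy server on localhost:8998 and Java 11's HttpClient (the session id and code string are just illustrative):

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

val client = HttpClient.newHttpClient()

def postJson(url: String, body: String): String = {
  val req = HttpRequest.newBuilder()
    .uri(URI.create(url))
    .header("Content-Type", "application/json")
    .POST(HttpRequest.BodyPublishers.ofString(body))
    .build()
  client.send(req, HttpResponse.BodyHandlers.ofString()).body()
}

// 1. Create an interactive Spark session.
println(postJson("http://localhost:8998/sessions", """{"kind": "spark"}"""))
// poll GET /sessions/<id> until its state is "idle"

// 2. Submit a statement to that session (session id 0 assumed here).
println(postJson("http://localhost:8998/sessions/0/statements",
  """{"code": "spark.range(10).count()"}"""))
// poll GET /sessions/0/statements/<id> for the result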
From: Steve Loughran
Date: Saturday, August 19, 2017 at 7:05 AM
To: Imtiaz Ahmed
Cc: "user@spark.apache.org"
Subject: Re: How to authenticate to ADLS from within spark job on the fly
On 19 Aug 2017,
For example, a user's bank card number must not be visible to an analyst and
should be replaced by asterisks. How do you do that in Spark?
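One sketch of what I mean, with made-up table and column names: mask the column before analysts can see it, keeping only the last four digits:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().appName("mask-sketch").getOrCreate()

// Replace every digit except the last four with '*' (Java regex lookahead).
val payments = spark.table("payments")
val masked = payments.withColumn(
  "card_number",
  regexp_replace(col("card_number"), "\\d(?=\\d{4})", "*")
)

// Analysts are only given access to the masked view, never the raw table.
masked.createOrReplaceTempView("payments_masked")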
On 19 Aug 2017, at 02:42, Imtiaz Ahmed <emtiazah...@gmail.com> wrote:
Hi All,
I am building a Spark library which developers will use when writing their
Spark jobs to get access to data on Azure Data Lake. But the authentication
will depend on the dataset they ask for. I need to call
Are you using Gradle or something similar for building?
> On 19. Aug 2017, at 11:58, Pascal Stammer wrote:
>
> Hi all,
>
> I am writing unit tests for my Spark application. In the rest of the project
> I am using log4j2.xml files to configure logging. Now I am running into some
> issues and need
Hi all,
I am writing unit tests for my Spark application. In the rest of the project I
am using log4j2.xml files to configure logging. Now I am running into some issues
and need the full executor and driver logs, but I can't get logging working in
my unit tests. Any hint on how to configure the logging
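For concreteness, the kind of test I mean: everything runs in local mode, so the driver and the executors share the test JVM and its Log4j 2 configuration; the file names below are just my assumptions (Log4j 2 will also pick up a log4j2-test.xml from the test classpath on its own):

import org.apache.spark.sql.SparkSession

// Point Log4j 2 at the test configuration before anything initializes logging.
System.setProperty("log4j.configurationFile", "src/test/resources/log4j2.xml")

// local[*]: driver and executors run inside this JVM, so this JVM's logging
// configuration is the one that applies to both.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("logging-test")
  .getOrCreate()

spark.range(5).count()
spark.stop()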