Hi!
Thanks for your info...:-)
Unfortunately, I have never received the email you're talking about (with
instructions to double-confirm), and it is not in my junk or deleted folders
either.
Who is the sender of that second email? And is there a way for you to send me
that link, so I can unsubscribe?
Hi Xiangrui
+1.
I've been working with data and analytics technologies in the finance
industry for many years, and I think that getting a well-established graph
query language like Cypher to operate over SparkSQL-conformant property
graphs would be relevant for lots of use cases where people wan
Hello, I'm having problems with a UDF. I was reading a bit about it, and it
looks like a closure issue, but I don't know how to fix it; it works fine
on Scala 2.11.
My code for the UDF definition (I tried several possibilities; this is the last one):
val o:org.apache.spark.sql.api.java.UDF2[java.sql.Timestam
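For comparison, here is a rough sketch of how I would expect an explicit UDF2
implementation and registration to look (the names format_ts, ts_col, label_col
and my_table are made up, and the return type is assumed to be a string):

import java.sql.Timestamp
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.api.java.UDF2
import org.apache.spark.sql.types.StringType

val spark = SparkSession.builder().appName("udf2-example").getOrCreate()

// Implement the Java UDF interface as an explicit anonymous class rather
// than a lambda, which sidesteps some closure/serialization surprises
// when moving from Scala 2.11 to 2.12.
val formatTs = new UDF2[Timestamp, String, String] {
  override def call(ts: Timestamp, label: String): String =
    s"$label: ${ts.toInstant}"
}

// Register with an explicit return DataType so Spark does not have to infer it.
spark.udf.register("format_ts", formatTs, StringType)

spark.sql("SELECT format_ts(ts_col, label_col) FROM my_table").show()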
Hi, I'm trying to run an external script on Spark using rdd.pipe(), and
although it runs successfully in standalone mode, it throws an error on the cluster.
The error comes from the executors and it is: "Cannot run program
"path/to/program": error=2, No such file or directory".
Does the external script need
I support this proposal - great idea, something that's been missing in the Spark
world. I'm a data architect working primarily in banking, with many years of
designing and tuning relational database systems and, more recently, Big
Data solutions, often including integration of old and new technologies. The
Yes, the script should be present on all the executor nodes.
You can pass your script via spark-submit (e.g. --files script.sh) and then
you should be able to refer to it (e.g. "./script.sh") in rdd.pipe.
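Something along these lines, roughly (a sketch; the script name and the sample
input are placeholders):

// Submit with the script shipped alongside the job, e.g.:
//   spark-submit --files script.sh ... your-app.jar
// --files copies it into each executor's working directory, so the
// relative path "./script.sh" resolves on the executors.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("pipe-example").getOrCreate()
val sc = spark.sparkContext

val input = sc.parallelize(Seq("line1", "line2"))
val piped = input.pipe("./script.sh")  // make sure the script is executable
piped.collect().foreach(println)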
- Arun
On Thu, 17 Jan 2019 at 14:18, Mkal wrote:
> Hi, im trying to run an external script