RE: How to unsubscribe???

2019-01-17 Thread Junior Alvarez
Hi! Thanks for your info :-) Unfortunately, I have never received the email you're talking about (with instructions to double confirm), and it is not in my junk or deleted folders either. Who is the sender of that second email? And is there a way for you to send me that link, so I can

Unsubscribe

2019-01-17 Thread Владимир Курятков
Unsubscribe

Re: SPIP: DataFrame-based Property Graphs, Cypher Queries, and Algorithms

2019-01-17 Thread clarrob
Hi Xiangrui +1. I've been working with data and analytics technologies in the finance industry for many years, and I think that getting a well-established graph query language like Cypher to operate over SparkSQL-conformant property graphs would be relevant for lots of use cases where people wan

UDF error with Spark 2.4 on Scala 2.12

2019-01-17 Thread Andrés Ivaldi
Hello, I'm having problems with a UDF. I was reading a bit about it, and it looks like a closure issue, but I don't know how to fix it; it works fine on 2.11. My code for the UDF definition (I tried several possibilities, this is the last one): val o:org.apache.spark.sql.api.java.UDF2[java.sql.Timestam
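Since the definition above is cut off in the archive, here is a minimal sketch of the same UDF2 pattern on Spark 2.4 / Scala 2.12, written as a named class rather than a lambda (one commonly suggested workaround for 2.12 closure/serialization trouble); the class name, column names, and return type are assumptions, not taken from the original message:

    import java.sql.Timestamp
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.api.java.UDF2
    import org.apache.spark.sql.types.LongType

    // Implementing the Java UDF2 interface with an explicit class avoids relying on
    // Scala 2.12's lambda/SAM conversion when Spark serializes the function.
    class DiffMillis extends UDF2[Timestamp, Timestamp, java.lang.Long] {
      override def call(a: Timestamp, b: Timestamp): java.lang.Long =
        if (a == null || b == null) null
        else java.lang.Long.valueOf(a.getTime - b.getTime)
    }

    object UdfSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("udf2-sketch").master("local[*]").getOrCreate()
        import spark.implicits._

        // Register with an explicit return DataType so the UDF is usable from SQL/expr.
        spark.udf.register("diff_millis", new DiffMillis, LongType)

        val df = Seq(
          (Timestamp.valueOf("2019-01-17 10:00:00"), Timestamp.valueOf("2019-01-17 09:00:00"))
        ).toDF("end_ts", "start_ts")

        df.selectExpr("diff_millis(end_ts, start_ts) AS elapsed_ms").show()
        spark.stop()
      }
    }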

Question about RDD pipe

2019-01-17 Thread Mkal
Hi, I'm trying to run an external script on Spark using rdd.pipe(), and although it runs successfully in standalone mode, it throws an error on the cluster. The error comes from the executors and it is: "Cannot run program "path/to/program": error=2, No such file or directory". Does the external script need

Re: SPIP: DataFrame-based Property Graphs, Cypher Queries, and Algorithms

2019-01-17 Thread dusanz
I support this proposal - great idea, something that's been missing in the Spark world. I'm a data architect working primarily in banking, with many years of designing and tuning relational database systems and, more recently, Big Data solutions, often including integration of old and new technologies. The

Re: Question about RDD pipe

2019-01-17 Thread Arun Mahadevan
Yes, the script should be present on all the executor nodes. You can pass your script via spark-submit (e.g. --files script.sh) and then you should be able to refer to it (e.g. "./script.sh") in rdd.pipe. - Arun On Thu, 17 Jan 2019 at 14:18, Mkal wrote: > Hi, I'm trying to run an external script
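As a concrete illustration of the approach Arun describes (a minimal sketch; the script name, app name, and spark-submit line are placeholders, not from the thread):

    import org.apache.spark.sql.SparkSession

    // Submit with something like:
    //   spark-submit --files /local/path/script.sh --class PipeSketch app.jar
    object PipeSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("pipe-sketch").getOrCreate()
        val sc = spark.sparkContext

        val input = sc.parallelize(Seq("line1", "line2", "line3"))

        // Files shipped via --files are copied into each executor's working directory,
        // so the script is referenced by its bare name, not a driver-local path.
        val piped = input.pipe("./script.sh")

        piped.collect().foreach(println)
        spark.stop()
      }
    }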