Re: How is the order ensured in the jdbc relation provider when inserting data from multiple executors

2016-11-29 Thread Sachith Withana
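The subject concerns ordering guarantees when multiple executors insert rows through the JDBC relation provider. Each partition is written by its own task, so rows from different executors interleave at the database; a deterministic insert order has to be imposed before the write. A minimal sketch of one way to do that (hypothetical input path, JDBC URL, table, and column names), by collapsing to a single sorted partition:

```java
import java.util.Properties;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OrderedJdbcWrite {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("ordered-jdbc-write").getOrCreate();
        Dataset<Row> df = spark.read().parquet("events.parquet"); // hypothetical input

        Properties props = new Properties();
        props.setProperty("user", "app");
        props.setProperty("password", "secret");

        // Each partition is inserted by its own task, so rows from different
        // executors interleave. Collapsing to one partition and sorting it
        // makes the insertion order deterministic (at the cost of parallelism).
        df.repartition(1)
          .sortWithinPartitions("event_time")
          .write()
          .jdbc("jdbc:postgresql://db:5432/analytics", "events", props); // hypothetical URL/table
    }
}
```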

Re: How to convert spark data-frame to datasets?

2016-11-21 Thread Sachith Withana
Most Spark ML algorithms require a Dataset to train the model. I would like to know how to convert a Spark DataFrame to a Dataset using Java. Your support is much appreciated. Thank you! Minudika -- Sachith Withana
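A minimal sketch of the conversion being asked about, assuming a hypothetical LabeledRecord bean whose fields match the DataFrame's columns. In Spark 2.x a DataFrame is just Dataset&lt;Row&gt;, and as() with a bean Encoder yields a typed Dataset:

```java
import java.io.Serializable;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ToDataset {
    // Hypothetical bean; its properties must match the DataFrame's columns.
    public static class LabeledRecord implements Serializable {
        private double label;
        private double feature;
        public double getLabel() { return label; }
        public void setLabel(double label) { this.label = label; }
        public double getFeature() { return feature; }
        public void setFeature(double feature) { this.feature = feature; }
    }

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("df-to-ds").getOrCreate();
        Dataset<Row> df = spark.read().json("records.json"); // hypothetical input

        // Convert the untyped DataFrame to a typed Dataset with a bean Encoder.
        Dataset<LabeledRecord> ds = df.as(Encoders.bean(LabeledRecord.class));
        ds.show();
    }
}
```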

Incremental Analysis with Spark

2015-11-25 Thread Sachith Withana
where timestamp is in the last 30 days. 2. Can we reuse the previously computed data? e.g. use the sum of the last 29 days and add the last day's worth of data? The second use case is very tempting, as it would improve performance drastically. Any suggestions would be greatly appreciated. -- Thanks, Sachith Withana
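A minimal sketch of the second use case, assuming the 29-day aggregate was previously materialized to a hypothetical Parquet path with a single "total" column: aggregate only the newest day's raw data, then combine it with the stored partial result instead of rescanning the full window.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class IncrementalSum {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("incremental-sum").getOrCreate();

        // Previously materialized aggregate for days 1..29 (hypothetical path,
        // assumed to hold one row with a "total" column).
        Dataset<Row> priorAgg = spark.read().parquet("agg/last29days.parquet");

        // Aggregate only the newest day's raw data (hypothetical path/column).
        Dataset<Row> latestDay = spark.read().parquet("raw/day30.parquet")
                .agg(sum(col("amount")).alias("total"));

        // Combine the partial results to get the full 30-day sum.
        Dataset<Row> last30 = priorAgg.union(latestDay)
                .agg(sum(col("total")).alias("total"));
        last30.show();
    }
}
```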

UDF Method overloading

2015-07-30 Thread Sachith Withana
Hi all, does Spark support UDF method overloading? e.g. I want to have a UDF with a varying number of arguments: multiply(a,b), multiply(a,b,c). Any suggestions? -- Thanks, Sachith Withana
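Spark SQL UDF registration is keyed by a single name, so a common workaround is to register one name per arity. A minimal sketch using Spark's Java UDF2/UDF3 interfaces (the names multiply2/multiply3 are hypothetical):

```java
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.api.java.UDF2;
import org.apache.spark.sql.api.java.UDF3;
import org.apache.spark.sql.types.DataTypes;

public class MultiplyUdfs {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("udf-arity").getOrCreate();

        // One registered name per arity.
        spark.udf().register("multiply2",
                (UDF2<Double, Double, Double>) (a, b) -> a * b,
                DataTypes.DoubleType);
        spark.udf().register("multiply3",
                (UDF3<Double, Double, Double, Double>) (a, b, c) -> a * b * c,
                DataTypes.DoubleType);

        spark.sql("SELECT multiply2(2.0, 3.0), multiply3(2.0, 3.0, 4.0)").show();
    }
}
```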

Re: Custom UDFs with zero parameters support

2015-07-28 Thread Sachith Withana
Reynold Xin wrote: > Yup - would you be willing to submit a patch to add UDF0? Should be pretty easy (really just add a new Java class, and then add a new function to registerUDF).
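A sketch of what such an adapter could look like, mirroring the shape of the existing UDF1/UDF2 interfaces in org.apache.spark.sql.api.java:

```java
package org.apache.spark.sql.api.java;

import java.io.Serializable;

// Sketch of a zero-argument UDF adapter, mirroring the existing
// one-argument and two-argument interfaces in this package.
public interface UDF0<R> extends Serializable {
    R call() throws Exception;
}
```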

Re: Custom UDFs with zero parameters support

2015-07-28 Thread Sachith Withana
you have to pass an empty parameter in the query, such as timestamp < now(' '), or it won't work. On Wed, Jul 29, 2015 at 11:46 AM, Reynold Xin wrote: > We should add UDF0 to it. For now, can you just create a one-arg UDF and not use the argument?
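A minimal sketch of the suggested workaround: register a one-arg UDF that simply ignores its argument, and pass a throwaway value at the call site (the function name "now" follows the thread; the query is illustrative).

```java
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.types.DataTypes;

public class NowUdf {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("now-udf").getOrCreate();

        // One-arg UDF that ignores its argument entirely.
        spark.udf().register("now",
                (UDF1<String, Long>) ignored -> System.currentTimeMillis(),
                DataTypes.LongType);

        // Usable in a filter such as WHERE ts < now(''), but the
        // throwaway argument is still required in the SQL.
        spark.sql("SELECT now('') AS current_ts").show();
    }
}
```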

Re: Custom UDFs with zero parameters support

2015-07-28 Thread Sachith Withana
Hi Reynold, I'm implementing the interfaces given here (https://github.com/apache/spark/tree/master/sql/core/src/main/java/org/apache/spark/sql/api/java), but currently there is no UDF0 adapter. Any suggestions? I'm new to Spark and any help would be appreciated. -- Thanks, Sachith Withana
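For the record, a UDF0 adapter was eventually added to Spark (in 2.3). Against a recent Spark, zero-argument registration looks roughly like this sketch (the function name is hypothetical):

```java
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.api.java.UDF0;
import org.apache.spark.sql.types.DataTypes;

public class ZeroArgUdf {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("udf0").getOrCreate();

        // Zero-argument UDF registration (UDF0 is available in Spark 2.3+).
        spark.udf().register("now",
                (UDF0<Long>) () -> System.currentTimeMillis(),
                DataTypes.LongType);

        spark.sql("SELECT now() AS current_ts").show();
    }
}
```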

Custom UDFs with zero parameters support

2015-07-28 Thread Sachith Withana
Or is there a way to support custom keywords such as "now" which would act as a custom UDF with no parameters? -- Thanks, Sachith Withana