I found lit() in the 1.3 documentation; it creates a literal Column rather than computing a percentile:

public static Column lit(Object literal)
<https://spark.apache.org/docs/1.3.1/api/java/org/apache/spark/sql/Column.html>

Creates a Column of a literal value.

The passed-in object is returned directly if it is already a Column. If the
object is a Scala Symbol, it is also converted into a Column. Otherwise, a new
Column is created to represent the literal value.
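To make the three cases in that javadoc concrete, here is a minimal plain-Scala sketch of the dispatch it describes. The `Column` and `Literal` types below are simplified stand-ins, not the real org.apache.spark.sql classes:

```scala
// Illustration only: simplified stand-ins for Spark's Column and literal
// expression types (NOT the real org.apache.spark.sql classes).
case class Column(expr: Any)
case class Literal(value: Any)

// Sketch of the dispatch the 1.3.1 javadoc describes: Columns pass
// through unchanged, Scala Symbols are converted to Columns, and any
// other value is wrapped as a literal Column.
def lit(literal: Any): Column = literal match {
  case c: Column => c                  // already a Column: returned directly
  case s: Symbol => Column(s.name)     // Scala Symbol: converted to a Column
  case v         => Column(Literal(v)) // otherwise: wrap the literal value
}
```

This is why lit(0.25) is the way to pass a constant argument, such as a percentile fraction, where the DataFrame API expects a Column.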

On Sat, Oct 10, 2015 at 12:39 AM, <saif.a.ell...@wellsfargo.com> wrote:

> Where can we find other available functions such as lit()? I can’t find
> lit() in the API.
>
>
>
> Thanks
>
>
>
> *From:* Michael Armbrust [mailto:mich...@databricks.com]
> *Sent:* Friday, October 09, 2015 4:04 PM
> *To:* unk1102
> *Cc:* user
> *Subject:* Re: How to calculate percentile of a column of DataFrame?
>
>
>
> You can use callUDF("percentile_approx", col("mycol"), lit(0.25)) to call
> Hive UDFs from DataFrames.
>
>
>
> On Fri, Oct 9, 2015 at 12:01 PM, unk1102 <umesh.ka...@gmail.com> wrote:
>
> Hi, how do I calculate the percentile of a column in a DataFrame? I can't
> find any percentile_approx function among Spark's aggregation functions.
> In Hive, for example, we have percentile_approx and can use it as follows:
>
> hiveContext.sql("select percentile_approx(mycol, 0.25) from myTable")
>
> I can see the ntile function, but I am not sure how it would give the same
> results as the query above. Please guide.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-calculate-percentile-of-a-column-of-DataFrame-tp25000.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
>
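For intuition about what the thread's percentile_approx query estimates, here is a minimal plain-Scala sketch of an exact percentile using linear interpolation between closest ranks. This is an illustrative assumption for small in-memory data; Hive's percentile_approx uses an approximate histogram-based algorithm and can differ on large datasets:

```scala
// Exact p-th percentile (0.0 <= p <= 1.0) of a sequence of doubles,
// using linear interpolation between the two closest ranks.
def percentile(values: Seq[Double], p: Double): Double = {
  require(values.nonEmpty, "percentile of empty sequence")
  val xs = values.sorted
  // Fractional rank of the requested percentile within the sorted data.
  val rank = p * (xs.length - 1)
  val lo = rank.toInt
  val hi = math.min(lo + 1, xs.length - 1)
  val frac = rank - lo
  xs(lo) + (xs(hi) - xs(lo)) * frac
}

// The 25th percentile the thread's query asks Hive for:
println(percentile(Seq(1, 2, 3, 4, 5), 0.25)) // 2.0
```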
