Hi,
Have a look here -
https://repost.aws/knowledge-center/spark-driver-logs-emr-cluster.
Usually, the application logs are available out of the box in the driver stdout.
The path looks like
"s3://aws-logs--us-east-1/elasticmapreduce/j-35PUYZBQVIJNM/containers/application_1572839353552_0008/container
Hi Grisha,
This is great :) It worked, thanks a lot.
I have this requirement: I will be running my Spark application on EMR
and want to build custom logging to create logs on S3. Any idea what I should do?
Or, in general, if I create a custom log (with my application name), where
will the logs be generated when
In Java, select expects an array of Columns, so you can simply convert your list
to an array:
array_df.select(fields.toArray(new Column[0]))
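For completeness, a self-contained sketch of the same pattern (the DataFrame contents and column names are just made up for illustration):

import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

public class SelectColumnsExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("select-columns-example")
                .master("local[*]")   // local master only for this standalone sketch
                .getOrCreate();

        Dataset<Row> arrayDf = spark.range(3).toDF("id")
                .withColumn("doubled", col("id").multiply(2));

        // Columns kept in a List, as in the question
        List<Column> fields = Arrays.asList(col("id"), col("doubled"));

        // select(Column... cols) is a varargs method, so pass an array
        arrayDf = arrayDf.select(fields.toArray(new Column[0]));
        arrayDf.show();

        spark.stop();
    }
}

toArray(new Column[0]) only tells the JDK which array type to allocate; the zero-length array is the usual idiom and the list itself is not modified.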
On Fri, Dec 29, 2023 at 10:58 PM PRASHANT L wrote:
Team,
I am using Java and want to select columns from a DataFrame; the columns are
stored in a List.
What is the equivalent of the below Scala code?
array_df = array_df.select(fields: _*)
When I try array_df = array_df.select(fields), I get an error complaining about a cast to
Column.
I am using Spark 3.4.