How about accumulators?
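
A rough sketch of the idea (assuming Spark 2.x's CollectionAccumulator; the
accumulator name and the sample DataFrame are just illustrative): catch the
exception inside the executor-side code, add the error details to the
accumulator, and read its value on the driver once the action finishes.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.types.StringType
    import scala.collection.JavaConverters._
    import scala.util.control.NonFatal

    object RowErrorAccumulator {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("row-error-accumulator").getOrCreate()

        // A CollectionAccumulator (Spark 2.x+) gathers values added on the
        // executors and exposes them on the driver after the action completes.
        val errors = spark.sparkContext.collectionAccumulator[String]("rowErrors")

        // Hypothetical input, standing in for whatever DataFrame you process.
        val df = spark.range(10).selectExpr("cast(id as string) as value")

        df.rdd.foreach { row =>
          try {
            row.schema.fields.foreach { f =>
              f.dataType match {
                case StringType => () // per-field work that may throw goes here
                case _          => ()
              }
            }
          } catch {
            case NonFatal(e) =>
              // Record the failure instead of letting the task die; keep the
              // message small, since accumulator values travel back to the driver.
              errors.add(s"row=$row error=${e.getMessage}")
          }
        }

        // Only read the accumulator after the action has finished.
        errors.value.asScala.foreach(msg => println(s"[row-error] $msg"))
        spark.stop()
      }
    }

One caveat: accumulator updates made inside an action like foreach are applied
once per successful task, but updates made inside transformations (map, etc.)
can be double-counted when tasks are retried, so do the try/catch in the
action if you can.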

Thanks,
Naresh
www.linkedin.com/in/naresh-dulam
http://hadoopandspark.blogspot.com/



On Thu, Mar 8, 2018 at 12:07 AM Chethan Bhawarlal <
cbhawar...@collectivei.com> wrote:

> Hi Dev,
>
> I am doing Spark operations at the RDD level for each row, like this:
>
>   private def obj(row: org.apache.spark.sql.Row): Put = {
>     row.schema.fields.foreach(x => {
>       x.dataType match {
>         case StringType => // some operation
>
>
> So, when a row contains an empty or garbage value, my code fails, and I am
> not able to catch the exceptions because these failures occur on the
> executors.
>
>
> Is there a way I can catch these exceptions, accumulate them, and print them
> in the driver logs?
>
>
> Any sample examples would be of great help.
>
>
> Thanks,
>
> Chethan.
