Looks good to me.
Thanks for the share.
On Wed, Nov 5, 2014 at 5:15 PM, Devopam Mittra wrote:
hi Nitin,
Thanks for the vital input around the Hadoop Home addition. At times such
things totally go off the radar when you have customized your own
environment.
As suggested, I have shared this on GitHub:
https://github.com/devopam/hadoopHA
Apologies if there is any problem on GitHub as I have limit
Good work, Devopam Mittra.
Regards,
Muthupandi.K
Think before you print.
On Wed, Nov 5, 2014 at 12:31 PM, Nitin Pawar wrote:
+1
If you can optionally add the Hadoop home directory in the script and use
it in the PATH, it can be used out of the box.
Also, can you share this on GitHub?
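Nitin's suggestion could be sketched roughly as below. This is a minimal, hedged example, not the actual script from the thread; `/usr/lib/hadoop` is only a placeholder default and should be adjusted for your distribution.

```shell
# Allow the caller to override HADOOP_HOME from the environment;
# otherwise fall back to a placeholder default (adjust per install).
HADOOP_HOME="${HADOOP_HOME:-/usr/lib/hadoop}"
export HADOOP_HOME

# Prepend the Hadoop binaries so `hdfs`, `hadoop`, etc. resolve without
# hard-coding paths elsewhere in the script.
export PATH="$HADOOP_HOME/bin:$PATH"
```

With this at the top of the script, a user with a customized environment keeps their own `HADOOP_HOME`, while everyone else gets a sensible default, which is what makes it usable "out of the box".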
On Wed, Nov 5, 2014 at 10:02 AM, Devopam Mittra wrote:
hi All,
Please find attached a simple shell script that dynamically determines the
active NameNode in the HA cluster and subsequently runs the Hive job /
query via Talend OS generated workflows.
It was tried successfully on an HDP 2.1 cluster with 2 NameNodes and 7
DataNodes running on CentOS 6.5.
Each ETL job invokes t
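The core of the approach described above (finding the active NameNode of an HA pair) could be sketched as follows. This is an illustrative sketch, not the attached script: `nn1` and `nn2` are hypothetical NameNode IDs as they would appear in `dfs.ha.namenodes.<nameservice>` in hdfs-site.xml, and `get_state` is a small wrapper introduced here for clarity.

```shell
# get_state prints the HA state ("active" or "standby") for a given
# NameNode ID by asking the cluster via `hdfs haadmin`.
get_state() {
  hdfs haadmin -getServiceState "$1" 2>/dev/null
}

# pick_active prints the first NameNode ID whose reported state is
# "active" and returns non-zero if none of them is active.
pick_active() {
  for nn in "$@"; do
    if [ "$(get_state "$nn")" = "active" ]; then
      printf '%s\n' "$nn"
      return 0
    fi
  done
  return 1
}

# Usage (hypothetical IDs from hdfs-site.xml):
#   active_nn=$(pick_active nn1 nn2) || { echo "no active NN" >&2; exit 1; }
```

Once the active NameNode is known, the script can point the Hive job / Talend workflow at it instead of at a NameNode that may currently be in standby.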