On 2014-04-15 at 10:29 -0600, Yves Dorfsman wrote:
> We've been pushing ssh public keys with Ansible, but this is becoming 
> cumbersome:
> 
> - it takes a significant amount of time to do so, and the time grows
> with the list of keys (an O(n) sort of thing)
> - keys only get pushed when somebody does a push, which means it
> becomes somebody's job, and we still easily miss new/temporary servers
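
For reference, the pattern described above is presumably something along
these lines -- a rough sketch, not the actual playbook; the "users"
variable and the keys/ layout are guesses:

  # push-keys.yml -- hypothetical sketch of the per-key push pattern
  - hosts: all
    tasks:
      - name: push each user's public key (work grows with the key list)
        authorized_key:
          user: "{{ item.name }}"
          key: "{{ lookup('file', 'keys/' + item.name + '.pub') }}"
          state: present
        with_items: "{{ users }}"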

The time delay for adding new SSH keys should not be an issue.  If you
find that you need to wait on a config deploy for someone's key to be
available, you should revisit how you decide who has access to what, and
plan things out in advance.

If your systems aren't in sync with what a full push would deploy to
them right now, you should revisit your deployment practices.  The fact
that someone has to type a push command should be a deployment _detail_:
the end result should be the same as if you had cron jobs permanently
syncing content to what's in the SCM system (excepting only how access
to the SCM data is authenticated, which is presumably the reason for the
manual trigger).
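
Concretely, ansible-pull plus a cron job gets you exactly that.  A rough
sketch, with a made-up repo URL and schedule (each host needs read
access to the SCM, which is where the authentication caveat above
bites):

  # converge.yml -- hypothetical: keep every host synced to SCM, so a
  # manual push is only ever a no-op
  - hosts: all
    tasks:
      - name: pull and apply the playbook from SCM every 30 minutes
        cron:
          name: ansible-pull
          minute: "*/30"
          job: "ansible-pull -U https://scm.example.org/config.git site.yml"

With that in place, who last typed a push stops mattering; every host,
including a new or temporary one, converges on its own.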

How are you validating what's currently deployed on each machine now,
such that after an incident you can reason about what _was_ on the
machine, instead of the five different states which _might_ have been on
it, depending on who last pushed and when?
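
If nothing answers that today, a read-only audit is a cheap place to
start: compare what's actually on each box against what's in SCM and
alert on drift.  A rough sketch, reusing the made-up "users" variable
and one tracked key file per user from above (running the real playbook
with --check --diff is another option):

  # audit-keys.yml -- hypothetical drift check: report, don't repair
  - hosts: all
    tasks:
      - name: read the authorized_keys actually deployed on the host
        slurp:
          src: "/home/{{ item.name }}/.ssh/authorized_keys"
        with_items: "{{ users }}"
        register: deployed

      - name: fail loudly if the host has drifted from SCM
        assert:
          that:
            - (item.content | b64decode | trim) == (lookup('file', 'keys/' + item.item.name + '.pub') | trim)
        with_items: "{{ deployed.results }}"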

-Phil