On developer servers we create a sandbox directory for each user and set 
the permissions on the directory so that apache and other developers can 
access it if necessary.  Here's what we use:

                "${home_dir}/${name}/sandboxes":
                        owner   => "$name",
                        group   => "$name",
                        mode    => "0751",
                        ensure  => $ensure ? { absent => absent, default => 
$_sandbox_dir };

As a result, a puppet run takes forever: it emits the message 
"FileBucket got a duplicate file" for every single duplicate file in our 
codebase, and that's a lot of files.  A run can take well over an hour. 
We didn't specify "recurse" for the directory, yet Puppet seems to be 
recursing through the directory structure anyway.  The more sandboxes our 
developers create, the longer a run takes.  I've tried suppressing the 
recursion by adding "recurse => false", but that made no difference (as 
expected, since false is already the default).
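
For reference, here is a sketch of the resource with that attempted setting 
in place -- the "recurse => false" line is the only change from the original 
declaration above:

```puppet
file {
        "${home_dir}/${name}/sandboxes":
                owner   => $name,
                group   => $name,
                mode    => "0751",
                recurse => false,   # explicit, though false is already the default
                ensure  => $ensure ? {
                        absent  => absent,
                        default => $_sandbox_dir,
                };
}
```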

How can I get Puppet to ignore the files inside the sandboxes directories 
so our puppet runs don't take quite so long?

We're still running Puppet 2.6.13 and can't upgrade immediately, since we 
need to be certain an upgrade won't break anything in our manifests.
