Hello folks,

We're starting to use Puppet in our production environment and are now running into some performance issues. For example, we have some large (200 MB) directories that were being deployed recursively through Puppet, which turned out to be totally inefficient (see "Minimize recursive file serving": http://docs.puppetlabs.com/guides/scaling.html). So, as a test, we built a custom Debian package (.deb) containing our files and libs; that way Puppet doesn't need to recursively check the md5sum of every file.
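Roughly speaking, the old manifest was just a recursive file resource along these lines (the module and path names here are simplified placeholders, not our real ones):

# Old approach: recursive file serving -- Puppet checksums every file on every run
file { '/opt/my-custom':
  ensure  => directory,
  recurse => true,
  source  => 'puppet:///modules/test/my-custom',
}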
file { "/tmp/my-custom.deb": ensure => present, source => "puppet:///modules/test/deb/my-custom.deb", } package {"my-custom": require => File['/tmp/my-custom.deb'], ensure => installed, source => "/tmp/my-custom.deb", provider => dpkg, } That way, works great (less than 30 sec), but when updated our custom package puppet just copy the file and do not execute the dpkg to install. How can I achive this goal? And there is a best way to manage large files? Can someone indicate me some references of the best deployment practices (puppet+custom debian or something else) ? Best regards, Sidarta Oliveira -- You received this message because you are subscribed to the Google Groups "Puppet Users" group. To post to this group, send email to puppet-users@googlegroups.com. To unsubscribe from this group, send email to puppet-users+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/puppet-users?hl=en.