On Wed, Feb 4, 2009 at 2:28 AM, Matt <mattmora...@gmail.com> wrote:
> Hi Nigel,
>
> I gather you run puppet --parseonly for each new file that svn is going
> to commit. Do you have your pre-hook to share?
It's actually Perforce, not Subversion... and it's kind of integrated into
some custom infrastructure here. Basically it looks at each file to be
modified in a changelist, and if it's an add or a modify, it runs
--parseonly. We used to use puppetmasterd --parseonly, which was better as
it would catch errors like imports of modules that didn't exist rather than
just syntactic errors, but that started breaking at some point quite a
while ago.

The best test, though, is to have machines that puppet against your most
unstable branches continuously, imho.

>
> Thanks,
>
> Matt
>
> 2009/1/7 Nigel Kersten <nig...@google.com>
>>
>> We use environments for a release process, so we can test releases
>> before pushing them to our stable environments.
>>
>> You can also get Puppet to check manifests for syntax validity with
>> --parseonly, which we and a lot of other people use as a commit hook in
>> version control, so that at least the syntax is guaranteed to be valid.
>> That catches the fat-finger errors, and the release process with
>> environments lets us test the actual functionality.
>>
>> On Wed, Jan 7, 2009 at 4:02 AM, Matt <mattmora...@gmail.com> wrote:
>>>
>>> Hi all,
>>>
>>> First thing - I've been keeping the Puppet manifests, configs,
>>> functions etc. in svn, but due to a few dodgy checkouts to the
>>> (non-production) puppetmaster I'd like to get a better process in
>>> place. Are people using anything to test their Puppet deployments,
>>> preferably continuously? How are you deploying/releasing Puppet
>>> manifests etc. to the master?
>>>
>>> Also, a quick Puppet question:
>>>
>>> I use this exec to get an application tar file which is specified
>>> in the node manifest ($dist = app-37434-3439493-.tar.gz):
>>>
>>>   exec { "get-app":
>>>     cwd     => "/opt/dist",
>>>     creates => "/opt/dist/$dist",
>>>     path    => ["/usr/bin", "/usr/sbin"],
>>>     command => "curl -s -f -o $dist http://$repoUrl/app/$dist",
>>>     before  => Exec["untar-dist"],
>>>   }
>>>
>>> The 'creates' value ensures that the file doesn't get re-downloaded on
>>> every Puppet run. Any new value of $dist works fine, but if $dist goes
>>> back to an old value then the file already exists, so the exec and the
>>> subsequent calls are not run.
>>>
>>> I guess using the file type would cure this, but I don't want the
>>> master to serve the file, as I use a similar curl command to get the
>>> file from S3 if specified.
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>
>> --
>> Nigel Kersten
>> Systems Administrator
>> Tech Lead - MacOps
>>
>

--
Nigel Kersten
Systems Administrator
Tech Lead - MacOps
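
A minimal sketch of the kind of pre-commit check discussed above, assuming a
Subversion repository rather than Nigel's Perforce setup (the hook layout,
the svnlook filtering and the .pp extension check are illustrative
assumptions, not details taken from the thread):

    #!/bin/sh
    # Hypothetical SVN pre-commit hook: reject the commit if any added or
    # updated .pp manifest in the incoming transaction fails a syntax check.
    REPOS="$1"
    TXN="$2"
    status=0
    tmp=$(mktemp)

    for path in $(svnlook changed -t "$TXN" "$REPOS" |
                  awk '$1 ~ /^[AU]/ && $2 ~ /\.pp$/ {print $2}'); do
        # Pull the incoming version of the manifest out of the transaction
        # and run the parse-only check against it.
        svnlook cat -t "$TXN" "$REPOS" "$path" > "$tmp"
        if ! puppet --parseonly "$tmp"; then
            echo "Puppet syntax error in $path" >&2
            status=1
        fi
    done

    rm -f "$tmp"
    exit $status

As Nigel notes, --parseonly only catches syntax errors; missing imports or
broken functionality still need a test environment (or a canary machine) to
show up.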
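
For the "machines that puppet against your most unstable branches
continuously" approach, one hypothetical way to wire that up is a cron entry
on a canary box that applies a testing environment on its own schedule (the
environment name, interval and path are made up for illustration, not from
the thread):

    # Hypothetical crontab entry on a canary machine: run the client once
    # every 30 minutes against an "unstable" environment on the master.
    */30 * * * * /usr/sbin/puppetd --onetime --no-daemonize --environment unstable

Failures on the canary then surface before the same change is promoted to a
stable environment.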