Hi,

On 02.08.2011, at 13:19, Boian Mihailov wrote:

> Hello everyone. I am fairly new to Puppet, and I just love it.
> 
> I am trying to distribute websites across my web servers.
> I've tried a file copy with recursion, but it is a fairly slow process and I
> believe it is not quite the right way to do this.
> 
> Now I am thinking of making a tar.gz of each site, distributing it to the
> server, and then extracting it there into the webroot.
> 
> My exact question is how to make Puppet unarchive this tar.gz only
> when it has changed, so it is not done every time Puppet resyncs.
> 
> something like:
> 
> class pwiki {
>       file { "/srv/www/pwiki.tar.gz":
>           source => "puppet://puppet/files/srv/www/pwiki.tar.gz"
>       }
> 
>       exec { "pwiki.tar.gz":
>               command => "tar zxf pwiki.tar.gz",
>               subscribe => File["/srv/www/pwiki.tar.gz"]
>       }
> }

Add the following line to your exec resource:

refreshonly => true,
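
For example, the whole class could look something like this (just a sketch; the cwd and path values are assumptions based on the paths in your snippet, so adjust them to your layout):

class pwiki {
    file { "/srv/www/pwiki.tar.gz":
        source => "puppet://puppet/files/srv/www/pwiki.tar.gz",
    }

    exec { "pwiki.tar.gz":
        command     => "tar zxf pwiki.tar.gz",
        cwd         => "/srv/www",                     # assumed: extract next to the archive
        path        => ["/bin", "/usr/bin"],           # so tar is found without an absolute path
        refreshonly => true,                           # only run when notified by the subscription
        subscribe   => File["/srv/www/pwiki.tar.gz"],
    }
}

With refreshonly => true the exec is skipped on normal runs and only fires when the subscribed file resource actually changes, which is the "only when it has changed" behaviour you are after.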

Regards,

Martin


-- 
You received this message because you are subscribed to the Google Groups 
"Puppet Users" group.
To post to this group, send email to puppet-users@googlegroups.com.
To unsubscribe from this group, send email to 
puppet-users+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/puppet-users?hl=en.
