On Friday, September 28, 2012 10:53:45 AM UTC-5, Jeremy wrote:
>
>
> The use of the "YAML.load(open(args[0]))" call was in fact to support both 
> local and network files. In this case I'm actually giving an authenticated 
> S3 bucket URL to retrieve the file as the engineers releasing the code also 
> upload the deployment YAML file to the S3 bucket. The tarballs that are 
> deployed are also in the S3 bucket and also pass authenticated URLs in the 
> catalog with an expiration equal to the catalog expiration time. I'd like 
> to eventually modify it to include retrieving the deployment file and 
> storing it locally only when it's been modified but want to keep it in S3 
> as it allows my Puppet master to operate as a blackbox that engineers have 
> no access to. If I control the deployment file locally they claim I'm the 
> bottleneck slowing them down so as long as I give them the means to update 
> it and the process flow is error free and only problems encountered are 
> when they screw up the deployment file contents accountability is 
> maintainable.
>


Given your target environment I can imagine why S3 may be attractive, but 
if you have not already done so then you should investigate whether it 
provides the performance guarantees (and real-life performance) necessary 
for the use to which you're putting it.

Have you considered pulling over the deployment file to the master on a 
periodic basis (such as via cron) so that it can always be local for your 
module?
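
As a rough sketch of what that could look like (the URL, cache path, and 
schedule below are placeholders, not your actual setup), a small Ruby 
script run from cron can use a conditional GET so it only downloads the 
file when S3 reports it has changed:

    #!/usr/bin/env ruby
    # Hypothetical cron-driven fetcher, e.g.:
    #   */5 * * * * /usr/local/bin/fetch_deployment.rb
    # Keeps a local copy of the deployment file current for the master.
    require 'open-uri'
    require 'time'

    url   = 'https://s3.amazonaws.com/example-bucket/deployment.yaml' # placeholder
    cache = '/etc/puppet/data/deployment.yaml'                        # placeholder

    headers = {}
    if File.exist?(cache)
      # Ask the server to send the body only if it is newer than our copy.
      headers['If-Modified-Since'] = File.mtime(cache).httpdate
    end

    begin
      open(url, headers) do |remote|
        File.open(cache, 'w') { |f| f.write(remote.read) }
      end
    rescue OpenURI::HTTPError => e
      # A 304 Not Modified just means the local copy is current;
      # re-raise anything else.
      raise unless e.io.status.first == '304'
    end

One caveat: if you rely on pre-signed S3 URLs, the fetcher would need a 
fresh signed URL for each run (or its own credentials), but your function 
would then only ever read the local path.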
 

>
> My thought on the "smoking gun" is having to make the parser function 
> calls to determine the unique components and the unique versions when a 
> component with multiple versions needs to be deployed. This was the 
> quickest way I could find to get the deployment file format converted and 
> to ensure that I only defined each resource once, avoiding duplicate 
> resource definition errors. As a result I'm calling the two functions 
> separately, and each has to iterate through the entire YAML content, 
> merging and then sorting for unique values. 
>  
>

How big are the real deployment files?  I wouldn't think that parsing and 
processing even a moderately large YAML file would be prohibitively 
expensive in itself, especially when compared to the work the master must 
perform to compile all the DSL code.  In any case, you should be able to 
test that against real data by wrapping a test harness around the innards 
of your function.
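
For instance (a minimal sketch, assuming a layout where the file is a hash 
of components each carrying a 'versions' list, which stands in for your 
real format), Ruby's Benchmark module lets you time the parse and the 
uniqueness pass separately against a real file:

    require 'yaml'
    require 'benchmark'

    # Stand-in for the innards of your two functions: collect the unique
    # components and the unique component/version pairs in a single pass
    # rather than merging and sorting the whole content once per function.
    def unique_entries(data)
      components = []
      versions   = []
      data.each do |component, info|
        components << component
        Array(info['versions']).each { |v| versions << [component, v] }
      end
      [components.uniq.sort, versions.uniq.sort]
    end

    data = nil
    Benchmark.bm(8) do |bm|
      bm.report('parse:')  { data = YAML.load(File.read('deployment.yaml')) }
      bm.report('reduce:') { unique_entries(data) }
    end

If the "reduce" step turns out to dominate, a single-pass collection like 
the above also spares you iterating the content once per function.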

Cheers,

John
