Zielfelder, Robert wrote on Wednesday, 21 December 2005, 14:15:
> Greetings,

Hello Robert

> A little background first:
>
>
>
> I have a bunch of PERL scripts that I use to automate tasks performed by
> CAM software.  I have one HP-UX 10.20 server, a Linux (Red Hat ES 3.0)
> server, and several Linux (Red Hat WS 3.0) workstations.  The HP-UX
> server runs a custom database package that manages the backup of the CAM
> data.  The Linux server is a central storage location for "working" CAM
> data.  The workstations run the CAM application and perform all of the
> edits to the data.
>
> The HP-UX server and Linux workstations all NFS mount an NFS export on
> the Linux server.
>
>
>
> The Problem:
>
>
>
> One of my scripts is run on the workstations and is used to retrieve CAM
> data from the HP-UX 10.20 server.  It runs a command remotely that
> passes information to an application that uncompresses (untar and un
> Gzip) the data and copies it to the Linux NFS share.  After this command
> runs, the script checks for the existence of the files the HP-UX machine
> is placing in the NFS share.  Sometimes, but not always, the existence
> check acts as if the files are not there; even though they are.  The
> real odd part about this is that this only happens about 40% of the time
> the script runs.  The other 60% of the time, the script sees the files
> properly and moves on.  I can't seem to nail down any pattern to this;
> it seems to fail at random.
>
>
>
> The workaround:
[...]
> The questions:
>
> Does anyone have any idea why this might be happening?  I understand
> that this is a rather obscure problem and that it would be difficult to
> know what is happening without seeing the problem first-hand.  
> 
> Could this be a problem with PERL interoperability with Linux and HP-UX?
> Could this be some sort of NFS problem?  Any other ideas?

I guess it's an NFS problem, but I can't help further on that.

> The script:
[...]


Just in case the problem cannot be solved at the NFS level, a few remarks on 
the script:

First, the script as posted could never run due to syntax errors (capitalized keywords like My, If, For, Last, Exit ...).

Always use

   use strict;  
   use warnings;

at the top.

> ##--run command on HP-UX machine to get and uncompress CAM data
> $dest_path point too the NFS share from the Linux server--##
>
> `/usr/bin/X11/xterm -e rsh HPSERVER -l user "/star/sys/prog/getrev
> ${dest_path}${job_name} -fld $folder -drw $drawer";
>
>
>
> ##--set vars--##
>
> My $junk1 = `ls ${dest_path}${job_name}`;

No need for a separate process here. Why not a file test operator, something like

   my $file_present = -r "${dest_path}${job_name}";

(or -e if the file only needs to exist, not be readable).
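
A minimal, self-contained sketch of that idea (the path values here are made 
up; use whatever your script really assigns to $dest_path and $job_name):

   use strict;
   use warnings;

   # example values only -- replace with your real settings
   my $dest_path = '/nfs/cam/working/';
   my $job_name  = 'job1234';

   my $full_path    = "${dest_path}${job_name}";
   my $file_present = -e $full_path;   # plain file test, no extra ls process
                                       # use -r if it must also be readable

The file test asks the filesystem directly, so no shell and no fork is 
involved at all.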

> My $junk;
>
>
>
> ##--if the file can't be seen go into the loop and look for it until it
> can be seen--##

What you do here is create up to 10,000 new (backtick) processes as fast as 
possible, which is a waste of resources. 
At the very least you should pause between tries, and possibly increase the 
pause after every unsuccessful try (see the sketch after the quoted code below).

> If ($junk1 eq "") {
>             For (1 .. 10000) {
>                         $junk = `ls ${dest_path}${job_name}`;
>                         If ($junk) {
>                                     Last();
>                         }
>             }
> }
>
> ##--if it still can't be seen exit--##
>
> If (!$junk || $junk eq "") {
>             Exit (1);
> }
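
If you really have to wait for the file to show up over NFS, something along 
these lines polls far more gently. This is only a sketch; the retry count and 
pause values are guesses you will want to tune:

   use strict;
   use warnings;
   use Time::HiRes qw(sleep);            # allows fractional-second sleeps

   # example values only -- substitute your real path components
   my $dest_path = '/nfs/cam/working/';
   my $job_name  = 'job1234';
   my $full_path = "${dest_path}${job_name}";

   my $found = 0;
   my $pause = 0.1;                      # start with a 100 ms pause
   for (1 .. 20) {                       # a handful of tries, not 10,000
       if (-e $full_path) {
           $found = 1;
           last;
       }
       sleep($pause);
       $pause *= 2 if $pause < 5;        # increase the pause, cap it around 5 s
   }

   exit(1) unless $found;                # still not visible: give up, as before

That keeps the behaviour of your original loop (wait, then bail out with exit 
status 1) but spawns no extra processes and does not spin the CPU.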


hth, joe
