Greetings,

A little background first:

I have a bunch of Perl scripts that I use to automate tasks performed by
CAM software.  I have one HP-UX 10.20 server, a Linux (Red Hat ES 3.0)
server, and several Linux (Red Hat WS 3.0) workstations.  The HP-UX
server runs a custom database package that manages the backup of the CAM
data.  The Linux server is a central storage location for "working" CAM
data.  The workstations run the CAM application and perform all of the
edits to the data.

The HP-UX server and the Linux workstations all NFS-mount an export from
the Linux server.

The Problem:

One of my scripts runs on the workstations and is used to retrieve CAM
data from the HP-UX 10.20 server.  It remotely runs a command that
passes information to an application that uncompresses (untars and
gunzips) the data and copies it to the Linux NFS share.  After this
command runs, the script checks for the existence of the files the
HP-UX machine is placing in the NFS share.  Sometimes, but not always,
the existence check acts as if the files are not there, even though they
are.  The really odd part is that this only happens about 40% of the
time the script runs.  The other 60% of the time, the script sees the
files properly and moves on.  I can't seem to nail down any pattern to
this; it seems to fail at random.

The workaround:

I was able to work around the problem by adding a for loop that rechecks
for the files until the script sees them.  By the time the counter in
the for loop reaches 10,000, the script sees the files and finishes
properly.  While the loop is running, I can open another shell and do an
'ls' and see the files myself, but for some reason the script won't see
them until it feels like it.  Sometimes the for loop only passes through
a few iterations; other times it iterates several thousand times before
it finally sees the files.  Regardless of the number of iterations, the
files are always in the right place and visible to a manual 'ls' from a
shell before the loop even starts.

The questions:

Does anyone have any idea why this might be happening?  I understand
that this is a rather obscure problem and that it would be difficult to
know what is happening without seeing it first-hand.

Could this be a problem with Perl interoperability between Linux and
HP-UX?  Could this be some sort of NFS problem?  Any other ideas?

The script:

Here is a snippet of the script I am using, with the workaround
included:

...

##--run command on HP-UX machine to get and uncompress CAM data;
##--$dest_path points to the NFS share from the Linux server--##

`/usr/bin/X11/xterm -e rsh HPSERVER -l user "/star/sys/prog/getrev ${dest_path}${job_name} -fld $folder -drw $drawer"`;

##--set vars--##

my $junk1 = `ls ${dest_path}${job_name}`;
my $junk = $junk1;

##--if the file can't be seen, go into the loop and look for it until it
##--can be seen--##

if ($junk1 eq "") {
    for (1 .. 10000) {
        $junk = `ls ${dest_path}${job_name}`;
        if ($junk) {
            last;
        }
    }
}

##--if it still can't be seen, exit--##

if (!$junk || $junk eq "") {
    exit(1);
}

...
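In case it helps to see the check in isolation, here is a minimal,
self-contained sketch of what the loop is doing, written as a helper
that uses Perl's -e file test and a sleep instead of shelling out to
'ls' in a tight loop.  The helper name, path, try count, and delay are
placeholders I made up for illustration, not names from the production
script:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder helper (not from the production script): poll for a
# path with Perl's -e file test, sleeping between attempts instead
# of spinning on `ls`.  Returns 1 if the path appears within the
# allotted tries, 0 otherwise.
sub wait_for_path {
    my ($path, $tries, $delay) = @_;
    for (1 .. $tries) {
        return 1 if -e $path;
        sleep $delay;
    }
    return 0;
}

# Example: this script file itself certainly exists, so the helper
# returns 1 on the first attempt without ever sleeping.  In the real
# script the path would be "${dest_path}${job_name}".
my $target = $0;    # placeholder target
print wait_for_path($target, 60, 1) ? "found\n" : "missing\n";
```

In the real script, a failure after the tries are exhausted would be
followed by the same exit(1) as in the snippet above.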

Thank you for taking the time to read this lengthy post and mull over my
problem.

Regards,

Robert Zielfelder