Try two.

Okay, how about something like the modified script below (untested).  
The key to your problem is to move the open of SCHEMAIN inside the 
outer loop, so the file is reopened, and its lines read again from the 
top, for every record in DATAIN.
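If reopening the file on every pass bothers you, an alternative is to 
rewind the handle with seek instead.  A minimal sketch (untested in 
your setup; demo.txt is just a throwaway file for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Create a throwaway file so the sketch is self-contained.
open my $out, '>', 'demo.txt' or die "Error opening file:  $!\n";
print $out "line one\nline two\n";
close $out;

open my $in, '<', 'demo.txt' or die "Error opening file:  $!\n";
my $first = <$in>;                          # reads "line one\n"
seek $in, 0, 0 or die "seek failed:  $!\n"; # rewind to the start
my $again = <$in>;                          # reads "line one\n" again
print "match\n" if $first eq $again;
unlink 'demo.txt';
```

Either way works; reopening is simpler to read, seek saves the open 
call on each pass.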

Note:  The one thing that worries me about this script is that it 
pretty much counts on @data having at least as many entries as SCHEMAIN 
has lines.  You know your data and I don't, so if this assumption 
holds, great.  If not, you might change the second loop to iterate over 
@data, instead of the SCHEMAIN handle.
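In case it helps, here's a sketch of that @data-driven variant: slurp 
the schema once into an array, then let @data drive the loop so you 
never run past the fields you actually have.  (The tag strings are just 
the sample lines from your example below, not your real schema.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch: the schema lines, read once up front instead of per record.
my @schema = ("<tag>a</tag>", "<tag2>a</tag2>", "<tag3>a</tag3>");
my @data   = split /~/, "xxx~yyy~zzz";

# Drive the loop from @data, so a short record just stops early.
for my $i (0 .. $#data) {
    (my $element = $schema[$i]) =~ s/>a/>$data[$i]/;
    print "$element\n";   # prints <tag>xxx</tag>, <tag2>yyy</tag2>, <tag3>zzz</tag3>
}
```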

By the way, I removed the unused $length line below.  I also removed 
the closes, since exiting will do this for you.

Hope this helps and I don't have to go for a third round!  <laughs>  
Just kidding, G'Luck!

James

#!/usr/bin/perl
use strict;
use warnings;

my ($datafile, $schemafile, $outputfile) = @ARGV;

open my $datain, '<', $datafile   or die "Error opening $datafile:  $!\n";
open my $xmlout, '>>', $outputfile or die "Error opening $outputfile:  $!\n";

print $xmlout "<substance>\n";
while (<$datain>) {
        chomp;
        my @data  = split /~/, $_;
        my $count = 0;

        # Reopen the schema file for every data record, so we start
        # over at its first line each time through.
        open my $schemain, '<', $schemafile
                or die "Error opening $schemafile:  $!\n";
        while (my $newelement = <$schemain>) {
                chomp $newelement;
                $newelement =~ s/>a/>$data[$count]/;
                print $xmlout "$newelement\n";
                $count++;
        }
}

print $xmlout "</substance>\n";

print "datafile = $datafile and schemafile = $schemafile and outputfile = $outputfile\n";

On Thursday, October 10, 2002, at 02:49  PM, Diethorn, Robert - MBDC 
wrote:

>
>       James (et al.),
>
>       see comments below.
>
> -----Original Message-----
> From: James Edward Gray II [mailto:[EMAIL PROTECTED]]
> Sent: Thursday, October 10, 2002 3:36 PM
> To: Diethorn, Robert - MBDC
> Cc: '[EMAIL PROTECTED]'
> Subject: Re: Newbie - reading from files in nested while loops
>
>
> On Thursday, October 10, 2002, at 02:28  PM, Diethorn, Robert - MBDC
> wrote:
>
>>
>>      Ding!
>
> Progress is good.  Don't worry, we'll get it there, eventually.
>
>>
>>      Many thanks James. I was so fixated on the second loop I failed to
>> see that the undef really occurred in the return to the first.
>>      Unfortunately, I think I need to retain the second loop. I need to
>> process the file $schemafile one line at a time, as every line of that
>> file
>> contains exactly one XML tag. Most of the "work" in the script is
>> actually
>> accomplished by the statement $newelement =~ s/>a/>$data[$count]/;  --
>> within that second loop.
>
> Okay, help me understand a little better then.  Are we going to process
> every line in SCHEMAIN for every line in DATAIN?  This would make the
> output contain x copies of the entire SCHEMAIN file, where x is the
> number of lines in DATAIN.  Is that what you're aiming for?
>
>
> ....exactly. Every line in DATAIN is a discrete record from my DB. 
> Each line
> in SCHEMAIN is an XML tag, and I'm attempting to merge the two to 
> create an
> XML file. I think I can show it best graphically:
>
> DATAIN: (~ is the field separator)
>
> xxx~yyy~zzz
> aaa~bbb~ccc
> etc.
>
> SCHEMAIN:
>
> <tag>a</tag>
> <tag2>a</tag2>
> <tag3>a</tag3>
> etc.
>
> Output:
>
> <tag>xxx</tag>
> <tag2>yyy</tag2>
> <tag3>zzz</tag3>
>
> <tag>aaa</tag>
> <tag2>bbb</tag2>
> <tag3>ccc</tag3>
>
> etc.
>
>       By the way, I did also try slurping the entire first file (DATAIN)
> into a multi-dimensional array and then parsing with a for loop around 
> the
> second while loop, but had no luck with that either.
>
>
>
>>      Is there any way I can set a marker for $_ in the original while
>> loop to avoid the undef and return to process the next line? Or am I
>> stuck
>> with slurping the first file into an array in its entirety and
>> processing
>> only the second file with a loop?
>
> You can, but I don't think this is needed.  Let me see what your above
> answer is first, before we go here.
>
> James
>
>
> Rob
>
> -- 
> To unsubscribe, e-mail: [EMAIL PROTECTED]
> For additional commands, e-mail: [EMAIL PROTECTED]
>

