Hey all,

I have around 1000 HTML files that I collected using different web crawling programs.
I need to save them and use them as part of a database.
The problem is that all of the files contain links to CGI programs, and those
links are written as relative paths such as /cgi-bin/foo/foo.pl.
I don't have local copies of these programs; they only exist on the remote servers.
Is there any way to parse the HTML files and prepend the proper URL to
paths like /cgi-bin/foo/foo.pl?

My original file:
<tr><td colspan="2">They are:</td></tr>
<tr><td>19 </td><td><a
href="/cgi-bin/lookup_public.pl?ID=10121">10121</a></td></tr>
<tr><td>19 </td><td><a
href="/cgi-bin/Name_lookup_public.pl?Name=Test12">Test12</a></td></tr>
</table><br/>

I need it to be:
<tr><td colspan="2">They are:</td></tr>
<tr><td>19 </td><td><a
href="http://foo.com/cgi-bin/lookup_public.pl?ID=10121";>10121</a></td></tr>
<tr><td>19 </td><td><a
href="http://foo2.com/cgi-bin/Name_lookup_public.pl?Name=Test12";>Test12</a></td></tr>
</table><br/>

Can you please point me towards a module / piece of code to get this done?
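
For what it's worth, here is a rough sketch of the kind of thing I have in mind,
using HTML::TreeBuilder and URI. The hosts foo.com / foo2.com and the
script-to-host mapping are just placeholders; I haven't tried this against the
real files yet:

#!/usr/bin/perl
use strict;
use warnings;
use HTML::TreeBuilder;
use URI;

# Placeholder mapping from CGI script name to the server it lives on;
# foo.com / foo2.com are made up -- substitute the real hosts here.
my %host_for = (
    'lookup_public.pl'      => 'http://foo.com',
    'Name_lookup_public.pl' => 'http://foo2.com',
);

for my $file (@ARGV) {
    my $tree = HTML::TreeBuilder->new_from_file($file);

    # Find every <a> whose href starts with /cgi-bin/ and make it absolute.
    for my $a ($tree->look_down(_tag => 'a', href => qr{^/cgi-bin/})) {
        my $href = $a->attr('href');
        my ($script) = $href =~ m{/cgi-bin/(?:.*/)?([^/?]+)};
        next unless defined $script;
        my $base = $host_for{$script} or next;   # skip scripts we don't know about
        $a->attr(href => URI->new_abs($href, $base)->as_string);
    }

    # Write the rewritten HTML next to the original file.
    open my $out, '>', "$file.fixed" or die "Cannot write $file.fixed: $!";
    print {$out} $tree->as_HTML;
    close $out;
    $tree->delete;
}

One caveat: as_HTML re-emits the whole document, so the output formatting will
not match the originals exactly. If that matters, rewriting only the href
attributes (e.g. with HTML::Parser or a careful regex) would be gentler.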
--
Happy Perl Programming to all !!!
