Thanks, Ted --

So you think it is a network problem?

Yes, this is an entirely "exclusive" use of the tables.

I agree with all of that. We used to run it locally. This thing has been
out of my hands for 10+ years. I don't know what administrative decisions
have been made in the meantime. For various reasons, I can't easily update
the system (one obvious reason is that I don't have VFP6 installed anywhere
any more -- I could reinstall it -- but that starts messing with "easily").
We need to be able to limp along for some amount of time until they make
some fundamental decisions.

That said -- I can take your essential point, and have them manually copy
the whole shebang onto a local drive, copy the EXE locally, too, and run it
there. Afterwards, they can copy the data back up to the network.
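For what it's worth, that copy-down / run / copy-up loop can be sketched as a small script. Everything below is a placeholder illustration, not the actual setup: the paths and file names are made up, and on the real Windows box this would be a batch file using COPY or XCOPY against the share, with the VFP EXE run in the middle step.

```shell
#!/bin/sh
set -e
# Sketch of the copy-local / process / copy-back workflow from this thread.
# All paths and file names are hypothetical placeholders.

NET_DIR=/tmp/demo_netshare    # stand-in for the network share
LOCAL_DIR=/tmp/demo_local     # fast local working directory

# Demo fixture only: pretend the share already holds the tables.
mkdir -p "$NET_DIR" "$LOCAL_DIR"
echo "customer data" > "$NET_DIR/customers.dbf"

# 1. One bulk read: pull the tables (and the EXE) down to the local drive.
cp -p "$NET_DIR"/* "$LOCAL_DIR"/

# 2. Run the intensive import/export against the local copy.
#    (Placeholder step; the real system would launch the VFP EXE here.)
echo "import complete" > "$LOCAL_DIR/run.log"

# 3. One bulk write back to the share. If this step fails, the originals
#    on the network are untouched and the copy can simply be retried
#    without rerunning the whole process.
cp -p "$LOCAL_DIR"/* "$NET_DIR"/
```

The point of the shape is the same one Ted makes below: all the chatty I/O happens at local HDD speeds, and the network sees only two bulk transfers.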

I am happy that they are getting me back in the loop to make business
decisions (leading to technical ones) about how to handle these data.

They are "suffering" from the "if it ain't broke, don't fix it" syndrome.
My application has been bulletproof (patting myself on the back pretty
hard) for over 16 years. If I had only done a crappier job, maybe they
would have paid attention to it sooner. <g>

Ken


On Wed, May 21, 2014 at 1:36 PM, Ted Roche <[email protected]> wrote:

> On 05/21/2014 02:05 PM, Ken Kixmoeller (ProFox) wrote:
> > On Wed, May 21, 2014 at 12:20 PM, Tracy Pearson <[email protected]>
> > wrote:
> >
> >> Sounds like a network timeout to me.
> >
> > Thanks. The Import/Export processes hit the database pretty much
> > continually through this, but if it is a network timeout, any ideas
> > what to do about it?
>
> My experience tells me that you shouldn't try to fix a hardware/network
> problem by coding around it. Let the client know they have a problem and
> they need to fix it.
>
> I'm guessing the big Import/Export routines have to have pretty
> exclusive access to the database so that no one updates stuff in the
> middle of your process. If that's true, why waste time trying to write a
> bunch of separate transactions to the network? Read from the network
> (perhaps even as a batch file, with COPY commands) and do the writing to
> the local drive, then load the result back up. That way you'll get local
> HDD speeds for the intensive processes, and you can use a bulk copy
> command to make one big write at the end of the process. If the bulk
> write fails, you still have the original copy and can retry without
> having to execute the process again.
>
>
>
> --
> Ted Roche & Associates, LLC    http://www.tedroche.com/
>
>

_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://mail.leafe.com/mailman/listinfo/profox
OT-free version of this list: http://mail.leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: 
http://leafe.com/archives/byMID/profox/CAFyV=Lk62Ui3f1JgS=snloqhb7oy82fwfxey9squl65mzps...@mail.gmail.com
** All postings, unless explicitly stated otherwise, are the opinions of the 
author, and do not constitute legal or medical advice. This statement is added 
to the messages for those lawyers who are too stupid to see the obvious.
